espace - Curtin’s institutional repository

    Gradient-free method for nonsmooth distributed optimization

    Access Status
    Fulltext not available
    Authors
    Li, J.
    Wu, Changzhi
    Wu, Z.
    Long, Q.
    Date
    2015
    Type
    Journal Article
    
    Citation
    Li, J. and Wu, C. and Wu, Z. and Long, Q. 2015. Gradient-free method for nonsmooth distributed optimization. Journal of Global Optimization. 61: pp. 325-340.
    Source Title
    Journal of Global Optimization
    DOI
    10.1007/s10898-014-0174-2
    ISSN
    0925-5001
    School
    Department of Construction Management
    URI
    http://hdl.handle.net/20.500.11937/35457
    Collection
    • Curtin Research Publications
    Abstract

    In this paper, we consider a distributed nonsmooth optimization problem over a computational multi-agent network. We first extend Nesterov’s (centralized) random gradient-free algorithm and Gaussian smoothing technique to the distributed case, and then prove the convergence of the resulting algorithm. Furthermore, an explicit convergence rate is given in terms of the network size and topology. Our proposed method is gradient-free, which may be preferred by practising engineers. Since only cost function values are required, in theory our method may suffer a factor of up to d (the dimension of each agent's decision variable) in convergence rate compared with distributed subgradient-based methods. However, our numerical simulations show that for some nonsmooth problems our method can achieve even better performance than subgradient-based methods, which may be caused by the slow convergence of the subgradient-based updates.
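    The record does not include an implementation, but the idea sketched in the abstract (a Gaussian-smoothing, two-point gradient-free oracle combined with consensus averaging over the network) can be illustrated in a few lines. Everything below is an assumption for illustration only, not the authors' code: the weight matrix W, the smoothing parameter mu, the step-size schedule, and the toy local costs f_i are all hypothetical.

```python
# Minimal sketch (NOT the authors' implementation): a distributed,
# gradient-free update that combines a Nesterov-style Gaussian-smoothing
# oracle with consensus averaging over a multi-agent network.
# W, mu, the step-size schedule and the toy costs below are illustrative.
import numpy as np

def gaussian_smoothing_oracle(f, x, mu, rng):
    """Two-point random oracle for the smoothed function
    f_mu(x) = E_u[f(x + mu*u)], u ~ N(0, I); uses function values only."""
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def distributed_step(fs, X, W, mu, step, rng):
    """One synchronous round: every agent averages its neighbours' iterates
    (consensus via the doubly stochastic matrix W), then moves along its
    local gradient-free oracle direction."""
    X = W @ X  # consensus / averaging step
    for i, f_i in enumerate(fs):
        X[i] = X[i] - step * gaussian_smoothing_oracle(f_i, X[i], mu, rng)
    return X

# Toy usage: 4 agents jointly minimising sum_i |x - a_i| (nonsmooth).
rng = np.random.default_rng(0)
anchors = [1.0, 2.0, 3.0, 4.0]
fs = [lambda x, a=a: float(np.abs(x - a).sum()) for a in anchors]
W = np.full((4, 4), 0.25)          # fully connected network, equal weights
X = np.zeros((4, 1))               # one scalar decision variable per agent
for k in range(5000):
    X = distributed_step(fs, X, W, mu=1e-3, step=0.05 / np.sqrt(k + 1), rng=rng)
print(X.ravel())                   # all agents end up near the minimising set [2, 3]
```

    In this sketch each agent evaluates only its own cost function, mirroring the abstract's point that no (sub)gradients are needed; the factor-of-d penalty mentioned above stems from the variance of the random direction u used by the oracle.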

    Related items

    Showing items related by title, author, creator and subject.

    • Incremental gradient-free method for nonsmooth distributed optimization
      Li, J.; Li, G.; Wu, Z.; Wu, Changzhi; Wang, X.; Lee, J.; Jung, K. (2017)
      In this paper, we consider the minimization of the sum of local convex component functions distributed over a multi-agent network. We first extend Nesterov's random gradient-free method to the incremental setting. Then ...
    • Distributed proximal-gradient methods for convex optimization with inequality constraints
      Li, J.; Wu, Changzhi; Wu, Z.; Long, Q.; Wang, Xiangyu (2014)
      We consider a distributed optimization problem over a multi-agent network, in which the sum of several local convex objective functions is minimized subject to global convex inequality constraints. We first transform the ...
    • Accelerated Sampling Optimization for RF Energy Harvesting Wireless Sensor Network
      Zhao, C.; Chen, S.; Wu, Changzhi; Chen, F.; Ji, Y. (2018)
      © 2013 IEEE. Network utility maximization has been widely adopted to allocate the resource of networks. However, it suffers from slow convergence under distributed computational environment. This paper proposes a fast ...