A Chebyshev-Accelerated Primal-Dual Method for Distributed Optimization
Abstract
We consider a distributed optimization problem over a network of agents aiming to minimize a global objective function that is the sum of local convex and composite cost functions. To this end, we propose a distributed Chebyshev-accelerated primal-dual algorithm to achieve faster ergodic convergence rates. In standard distributed primal-dual algorithms, the speed of convergence towards a global optimum (i.e., a saddle point of the corresponding Lagrangian function) is directly influenced by the eigenvalues of the Laplacian matrix representing the communication graph. In this paper, we use Chebyshev matrix polynomials to generate gossip matrices whose spectral properties result in faster convergence speeds, while allowing for a fully distributed implementation. As a result, the proposed algorithm requires fewer gradient updates at the cost of additional rounds of communication between agents. We illustrate the performance of the proposed algorithm on a distributed signal recovery problem. Our simulations show how Chebyshev matrix polynomials can be used to improve the convergence speed of a primal-dual algorithm over communication networks, especially in networks with poor spectral properties, by trading local computation for communication rounds.
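The spectral idea behind the acceleration can be sketched as follows. For a doubly stochastic gossip matrix W whose non-consensus eigenvalues lie in [-λ, λ] with λ < 1, applying the scaled Chebyshev polynomial T_K(W/λ)/T_K(1/λ) shrinks the disagreement component by roughly 1/T_K(1/λ) instead of λ^K, at the price of K communication rounds. This is an illustrative sketch of that mechanism only, not the paper's full primal-dual algorithm; the function name and the ring-network example below are our own choices.

```python
import numpy as np

def chebyshev_gossip(W, x, K, lam):
    """Apply the scaled Chebyshev polynomial T_K(W/lam) / T_K(1/lam) to x.

    W   : symmetric, doubly stochastic gossip matrix; eigenvalue 1 on the
          consensus direction, all other eigenvalues in [-lam, lam], 0 < lam < 1.
    K   : polynomial degree; each step of the three-term recurrence costs one
          round of neighbor-to-neighbor communication (one product with W).

    The consensus (mean) component of x is preserved exactly, while the
    disagreement component is damped by at most 1 / T_K(1/lam).
    """
    # Three-term Chebyshev recurrence T_{k+1}(y) = 2y T_k(y) - T_{k-1}(y),
    # run simultaneously on the vector (y = W/lam) and on the scalar y = 1/lam.
    z_prev, z = x, (W @ x) / lam          # T_0(W/lam) x and T_1(W/lam) x
    t_prev, t = 1.0, 1.0 / lam            # T_0(1/lam) and T_1(1/lam)
    for _ in range(K - 1):
        z_prev, z = z, (2.0 / lam) * (W @ z) - z_prev
        t_prev, t = t, (2.0 / lam) * t - t_prev
    return z / t
```

On a ring of n agents with self-weight 1/2 and weight 1/4 to each neighbor (a graph with a poor spectral gap, λ = 1/2 + (1/2)cos(2π/n) close to 1), K such rounds leave the iterate markedly closer to the average than K plain applications of W, which is the trade of communication rounds for gradient updates described in the abstract.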
Publication: arXiv e-prints
Pub Date: October 2018
DOI: 10.48550/arXiv.1810.06713
arXiv: arXiv:1810.06713
Bibcode: 2018arXiv181006713S
Keywords: Mathematics - Optimization and Control