Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
Abstract
Convex nonsmooth optimization problems, whose solutions live in very high dimensional spaces, have become ubiquitous. To solve them, the class of first-order algorithms known as proximal splitting algorithms is particularly well suited: they consist of simple operations, handling the terms in the objective function separately. In this overview, we demystify a selection of recent proximal splitting algorithms: we present them within a unified framework, which consists in applying splitting methods for monotone inclusions in primal-dual product spaces, with well-chosen metrics. Along the way, we easily derive new variants of the algorithms and revisit existing convergence results, extending the parameter ranges in several cases. In particular, we emphasize that when the smooth term in the objective function is quadratic, e.g., for least-squares problems, convergence is guaranteed with larger values of the relaxation parameter than previously known. Such larger values are usually beneficial for the convergence speed in practice.
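To make the abstract concrete, the sketch below illustrates the simplest member of this algorithm class: relaxed forward-backward (proximal gradient) splitting applied to a least-squares problem with an l1 penalty. This is an illustrative example, not the paper's unified framework; the function names, the problem instance, and the parameter choices (step size 1/L, relaxation rho = 1.5) are our own assumptions for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (applied componentwise).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def relaxed_forward_backward(A, b, lam, rho=1.5, n_iter=500):
    """Relaxed proximal gradient method for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    The gradient step handles the smooth (quadratic) term and the
    prox step handles the nonsmooth l1 term separately, as in
    proximal splitting. rho is the relaxation parameter; for a
    quadratic smooth term, larger values than the classical range
    can still guarantee convergence (the point emphasized above)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    gamma = 1.0 / L                         # step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)            # forward (gradient) step
        x_half = soft_threshold(x - gamma * grad, gamma * lam)  # backward (prox) step
        x = x + rho * (x_half - x)          # relaxation step
    return x

# Small synthetic example (assumed data, for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
x_sol = relaxed_forward_backward(A, b, lam=0.1)
```

Each iteration touches the two terms of the objective through separate, simple operations (a matrix-vector gradient step and a componentwise soft-thresholding), which is the defining feature of proximal splitting methods.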
 Publication:

arXiv e-prints
 Pub Date:
 November 2019
 DOI:
 10.48550/arXiv.1912.00137
 arXiv:
 arXiv:1912.00137
 Bibcode:
 2019arXiv191200137C
 Keywords:

 Mathematics - Optimization and Control;
 90C25;
 90C30;
 90C06;
 47J25;
 47J26;
 68W15;
 65K05
 E-Print:
 To appear in SIAM Review