The connections between Lyapunov functions for some optimization algorithms and differential equations
Abstract
In this manuscript, we study the properties of a family of second-order differential equations with damping, its discretizations, and their connections with accelerated optimization algorithms for $m$-strongly convex and $L$-smooth functions. In particular, using the Linear Matrix Inequality (LMI) framework developed by \emph{Fazlyab et al. (2018)}, we derive analytically a (discrete) Lyapunov function for a two-parameter family of Nesterov optimization methods, which allows for a complete characterization of their convergence rate. In the appropriate limit, this family of methods may be seen as a discretization of a family of second-order ordinary differential equations, for which we construct (continuous) Lyapunov functions by means of the LMI framework. The continuous Lyapunov functions may, alternatively, be obtained by studying the limiting behaviour of their discrete counterparts. Finally, we show that the majority of typical discretizations of this family of ODEs, such as the Heavy ball method, do not possess Lyapunov functions with properties similar to those of the Lyapunov function constructed here for the Nesterov method.
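For orientation, a minimal illustrative sketch (not the paper's specific two-parameter family): a standard form of Nesterov's method for an $m$-strongly convex, $L$-smooth function $f$, together with the damped second-order ODE commonly associated with its small-step limit in the literature, reads
$$
x_{k+1} = y_k - \tfrac{1}{L}\,\nabla f(y_k), \qquad
y_{k+1} = x_{k+1} + \frac{\sqrt{L}-\sqrt{m}}{\sqrt{L}+\sqrt{m}}\,\bigl(x_{k+1} - x_k\bigr),
$$
$$
\ddot{X}(t) + 2\sqrt{m}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0 .
$$
A continuous Lyapunov function for such an ODE typically takes the form $\mathcal{E}(t) = f(X(t)) - f(x^\ast) + \tfrac{1}{2}\bigl\|\dot{X}(t) + \sqrt{m}\,(X(t)-x^\ast)\bigr\|^2$, whose decay along trajectories yields the convergence rate; the discrete Lyapunov functions studied in the paper play the analogous role for the iterates.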
- Publication:
- arXiv e-prints
- Pub Date:
- September 2020
- DOI:
- 10.48550/arXiv.2009.00673
- arXiv:
- arXiv:2009.00673
- Bibcode:
- 2020arXiv200900673S
- Keywords:
- Mathematics - Numerical Analysis;
- Computer Science - Machine Learning;
- Mathematics - Optimization and Control;
- 65L06;
- 65L20;
- 90C25;
- 93C15
- E-Print:
- 21 pages, 1 figure