We implement the adaptive step-size schemes of the optimization methods AdaGrad and Adam in a novel variant of the Proximal Gradient Method (PGM). Our algorithm, dubbed AdaProx, avoids the explicit computation of Lipschitz constants or additional line searches and thus reduces per-iteration cost. In test cases of constrained matrix factorization, we demonstrate the advantages of AdaProx over PGM in fidelity and performance, while still allowing for arbitrary penalty functions. The Python implementation of the algorithm presented here is available as an open-source package at https://github.com/pmelchior/proxmin.
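The core idea is to replace PGM's Lipschitz-based step size with Adam-style per-element adaptive steps, applying the proximal operator after each adaptive gradient step. The following is a minimal NumPy sketch of such an Adam-style proximal update, written under stated assumptions rather than taken from the proxmin API; the names `adaprox`, `prox_nonneg`, `alpha`, `b1`, and `b2` are illustrative.

```python
import numpy as np

def prox_nonneg(x, step):
    """Prox of the indicator of the non-negative orthant:
    projection onto x >= 0 (the step size is irrelevant for projections)."""
    return np.maximum(x, 0)

def adaprox(x0, grad, prox, alpha=1e-2, b1=0.9, b2=0.999, eps=1e-8, max_iter=1000):
    """Illustrative Adam-style proximal gradient sketch (not the proxmin API).

    grad(x) returns the gradient of the smooth part of the objective;
    prox(y, step) applies the proximal operator of the penalty with
    (elementwise) step size `step`.
    """
    x = x0.copy()
    m = np.zeros_like(x)      # first-moment (mean) estimate of the gradient
    v = np.zeros_like(x)      # second-moment estimate of the gradient
    vhat = np.zeros_like(x)   # AMSGrad-style running max of v
    for t in range(1, max_iter + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g**2
        vhat = np.maximum(vhat, v)            # keeps effective step sizes non-increasing
        mhat = m / (1 - b1**t)                # bias-corrected first moment
        step = alpha / (np.sqrt(vhat) + eps)  # per-element step, no Lipschitz constant needed
        x = prox(x - step * mhat, step)       # adaptive gradient step followed by prox
    return x

# Usage sketch: non-negative least squares, min ||Ax - b||^2 s.t. x >= 0
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.abs(rng.standard_normal(5))
x = adaprox(np.zeros(5), grad=lambda x: A.T @ (A @ x - b), prox=prox_nonneg)
```

Because the prox is an arbitrary callable, any penalty with a computable proximal operator can be swapped in, which is what allows the constrained matrix factorization tests described above.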
- Pub Date: October 2019
- Mathematics - Optimization and Control
- Astrophysics - Instrumentation and Methods for Astrophysics
- Electrical Engineering and Systems Science - Image and Video Processing
- 7 pages, 5 figures