Algorithm-independent bounds on complex optimization through the statistics of marginal optima
Abstract
Optimization seeks the extremal points of a function. When there are superextensively many optima, optimization algorithms are liable to get stuck. Under these conditions, generic algorithms tend to find marginal optima, which have many nearly flat directions. In a companion paper, we introduce a technique to count marginal optima in random landscapes. Here, we use the statistics of marginal optima calculated using this technique to produce generic bounds on optimization, based on the simple principle that algorithms will overwhelmingly tend to get stuck only where marginal optima are found. We demonstrate the idea on a simple non-Gaussian problem of maximizing the sum of squared random functions on a compact space. Numerical experiments using both gradient descent and generalized approximate message passing algorithms fall inside the expected bounds.
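The optimization problem described in the abstract can be made concrete with a small numerical sketch. The code below is a minimal illustration (not the authors' implementation) of projected gradient ascent on a cost of the form C(x) = ½ Σ_k V_k(x)² constrained to the sphere |x|² = N; the specific linear-plus-quadratic Gaussian form chosen for the random functions V_k, and all parameter values, are assumptions made here purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # dimension; configurations live on the sphere |x|^2 = N
M = 150        # number of random functions V_k
lr = 2e-3      # gradient-ascent step size
steps = 2000

# Illustrative random functions: V_k(x) = a_k.x/sqrt(N) + x.W_k.x/(2N),
# with Gaussian a_k and symmetric Gaussian W_k (a choice made here for
# concreteness; the paper specifies the V_k more generally).
a = rng.standard_normal((M, N))
W = rng.standard_normal((M, N, N))
W = (W + np.transpose(W, (0, 2, 1))) / np.sqrt(2 * N)   # GOE-like scaling

def values_and_grads(x):
    """Return the M values V_k(x) and their gradients (shape (M, N))."""
    Wx = np.einsum('kij,j->ki', W, x)
    v = a @ x / np.sqrt(N) + (Wx @ x) / (2 * N)
    gradv = a / np.sqrt(N) + Wx / N
    return v, gradv

# Start from a uniformly random point on the sphere of radius sqrt(N).
x = rng.standard_normal(N)
x *= np.sqrt(N) / np.linalg.norm(x)

for _ in range(steps):
    v, gradv = values_and_grads(x)
    g = gradv.T @ v                       # gradient of C(x) = (1/2) sum_k V_k(x)^2
    g -= (g @ x / N) * x                  # project onto the tangent plane of the sphere
    x += lr * g
    x *= np.sqrt(N) / np.linalg.norm(x)   # retract back onto the sphere

v, _ = values_and_grads(x)
print("cost per dimension at the stopping point:", 0.5 * np.sum(v**2) / N)
```

In a rough landscape of this kind, runs started from different random points generally terminate at different local optima; the bounds discussed in the paper concern where such stopping points can lie.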
- Publication: arXiv e-prints
- Pub Date: July 2024
- DOI: 10.48550/arXiv.2407.02092
- arXiv: arXiv:2407.02092
- Bibcode: 2024arXiv240702092K
- Keywords: Condensed Matter - Disordered Systems and Neural Networks; Condensed Matter - Statistical Mechanics