Sudakov–Fernique post-AMP, and a new proof of the local convexity of the TAP free energy
Abstract
In many problems in modern statistics and machine learning, it is of interest to establish that a first-order method on a nonconvex risk function eventually enters a region of parameter space in which the risk is locally convex. We derive an asymptotic comparison inequality, which we call the Sudakov–Fernique post-AMP inequality, which, in a certain class of problems involving a GOE matrix, is able to probe properties of an optimization landscape locally around the iterates of an approximate message passing (AMP) algorithm. As an example of its use, we provide a new, and arguably simpler, proof of some of the results of Celentano et al. (2021), which establish that the so-called TAP free energy in the $\mathbb{Z}_2$-synchronization problem is locally convex in the region to which AMP converges. We further prove a conjecture of El Alaoui et al. (2022) involving the local convexity of a related but distinct TAP free energy, which, as a consequence, confirms that their algorithm efficiently samples from the Sherrington–Kirkpatrick Gibbs measure throughout the "easy" regime.
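To make the setting concrete, the following is a minimal illustrative sketch (not the authors' code) of the AMP iteration for $\mathbb{Z}_2$-synchronization with observation $Y = (\lambda/n)\,xx^\top + W/\sqrt{n}$, $W$ a GOE matrix. The function name, the informative initialization, and the iteration count are assumptions made for exposition; the tanh denoiser and the Onsager correction term are the standard ingredients of AMP in this model.

```python
# Hedged sketch of AMP for Z2-synchronization (illustration only, not the
# authors' implementation). Model: Y = (lam/n) x x^T + W/sqrt(n), W ~ GOE.
import numpy as np

def amp_z2_sync(Y, lam, m0, iters=50):
    """Run the AMP iteration with tanh denoiser; return the final iterate."""
    f_prev = np.zeros(Y.shape[0])
    m = m0.astype(float)
    for _ in range(iters):
        f = np.tanh(lam * m)              # posterior-mean denoiser for +-1 spins
        b = lam * np.mean(1.0 - f ** 2)   # Onsager coefficient: (1/n) sum_i f'(m_i)
        m, f_prev = Y @ f - b * f_prev, f # matrix step minus the memory term
    return m
```

In the abstract's setting, the local-convexity results concern the TAP free energy in a neighborhood of the point to which such iterates converge.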
Publication: arXiv e-prints
Pub Date: August 2022
DOI: 10.48550/arXiv.2208.09550
arXiv: arXiv:2208.09550
Bibcode: 2022arXiv220809550C
Keywords: Mathematics – Probability; Computer Science – Machine Learning; Mathematics – Statistics Theory