Equivalence between belief propagation instability and transition to replica symmetry breaking in perceptron learning systems
Abstract
The binary perceptron is a fundamental model of supervised learning for nonconvex optimization and a building block of popular deep learning architectures. The binary perceptron classifies random high-dimensional data based on the marginal probabilities of its binary synapses. The relationship between the belief propagation instability and the equilibrium analysis of the model has remained elusive. Here, we establish this relationship by showing that the instability condition around the belief propagation fixed point is identical to the condition for breaking the replica symmetric saddle-point solution of the free-energy function. Our analysis may thus offer insight into other learning systems, bridging the gap between nonconvex learning dynamics and the statistical mechanics of more complex neural networks.
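The belief propagation whose fixed-point stability the abstract refers to can be sketched in a Gaussian-cavity ("relaxed BP") form for the binary perceptron. The function `relaxed_bp` below, its update equations, and all parameter choices (pattern sizes, damping, iteration count) are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np
from scipy.stats import norm


def relaxed_bp(xi, y, iters=200, damping=0.5):
    """Gaussian-cavity ("relaxed") belief propagation for a binary perceptron.

    xi : (P, N) array of +/-1 input patterns; y : (P,) array of +/-1 labels.
    Returns the marginal magnetizations m_i = <w_i> of the binary weights.
    Illustrative sketch only, not the paper's exact message-passing scheme.
    """
    P, N = xi.shape
    m = np.zeros((N, P))                     # variable-to-factor messages m_{i->a}
    for _ in range(iters):
        S = xi * m.T                         # xi[a, i] * m[i, a]
        # cavity mean and variance of the pre-activation at each factor a
        w = (S.sum(1, keepdims=True) - S) / np.sqrt(N)
        c = 1.0 - m.T ** 2
        V = np.maximum((c.sum(1, keepdims=True) - c) / N, 1e-9)
        z = y[:, None] * w / np.sqrt(V)
        # factor-to-variable bias: Gaussian pdf/cdf ratio, in log space for stability
        ratio = np.exp(norm.logpdf(z) - norm.logcdf(z))
        u = y[:, None] * xi / np.sqrt(N) * ratio / np.sqrt(V)
        h = u.sum(0)[:, None] - u.T          # cavity fields h_{i->a}
        m = (1 - damping) * np.tanh(h) + damping * m
    return np.tanh(u.sum(0))                 # full marginals m_i


# demo: teacher-student setup, so a compatible binary weight vector exists
rng = np.random.default_rng(0)
N, P = 60, 30
xi = rng.choice([-1.0, 1.0], size=(P, N))
w_teacher = rng.choice([-1.0, 1.0], size=N)
y = np.where(xi @ w_teacher >= 0, 1.0, -1.0)

m_hat = relaxed_bp(xi, y)
w_hat = np.where(m_hat >= 0, 1.0, -1.0)     # clip marginals to binary weights
train_acc = np.mean(np.where(xi @ w_hat >= 0, 1.0, -1.0) == y)
```

The instability the paper analyzes concerns how small perturbations of these messages grow (or decay) around the converged fixed point; the sketch only produces the fixed-point marginals themselves.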
- Publication: Physical Review Research
- Pub Date: April 2022
- DOI: 10.1103/PhysRevResearch.4.023023
- arXiv: arXiv:2111.13302
- Bibcode: 2022PhRvR...4b3023Z
- Keywords: Condensed Matter - Disordered Systems and Neural Networks; Condensed Matter - Statistical Mechanics; Statistics - Machine Learning
- E-Print: 24 pages, 2 figures, revision to journal