Glassy nature of the hard phase in inference problems
Abstract
An algorithmically hard phase has been described in a range of inference problems: even if the signal can be reconstructed with a small error from an information-theoretic point of view, known algorithms fail unless the noise-to-signal ratio is sufficiently small. This hard phase is typically understood as a metastable branch of the dynamical evolution of message passing algorithms. In this work we study the metastable branch for a prototypical inference problem, low-rank matrix factorization, that presents a hard phase. We show that for noise-to-signal ratios that are below the information-theoretic threshold, the posterior measure is composed of an exponential number of metastable glassy states, and we compute their entropy, called the complexity. We show that this glassiness extends even slightly below the algorithmic threshold below which the well-known approximate message passing (AMP) algorithm is able to closely reconstruct the signal. Counterintuitively, we find that the performance of the AMP algorithm is not improved by taking into account the glassy nature of the hard phase. This result provides further evidence that the hard phase in inference problems is algorithmically impenetrable for some deep computational reasons that remain to be uncovered.
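To make the AMP algorithm mentioned in the abstract concrete, here is a minimal sketch for a toy instance of low-rank (rank-one) matrix estimation: the spiked Wigner model with a ±1 (Rademacher) prior on the signal. This is an illustrative assumption, not the paper's exact setup or experiments; the model `A = (lam/n) x x^T + W`, the `tanh` Bayes denoiser, and the chosen parameters (`n`, `lam`, the random initialization) are all standard textbook choices supplied here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem size and signal-to-noise ratio. For the Rademacher spiked Wigner
# model, lam > 1 lies in the algorithmically easy phase, so AMP should
# recover the signal here (this toy choice is an assumption, not the
# paper's regime of study).
n, lam, iters = 1500, 3.0, 60

# Ground-truth rank-one signal x with +/-1 entries.
x = rng.choice([-1.0, 1.0], size=n)

# Symmetric Gaussian noise matrix W with entries of variance 1/n.
G = rng.normal(size=(n, n)) / np.sqrt(n)
W = (G + G.T) / np.sqrt(2)

# Observed matrix: rank-one spike plus noise.
A = (lam / n) * np.outer(x, x) + W

# AMP iteration: effective field with Onsager correction, followed by the
# Bayes-optimal denoiser tanh(lam * z) for the +/-1 prior.
xt = 0.01 * rng.normal(size=n)       # tiny random init breaks the +/- symmetry
x_prev = np.zeros(n)
b = 0.0
for _ in range(iters):
    z = A @ xt - b * x_prev          # Onsager-corrected effective field
    x_prev = xt
    xt = np.tanh(lam * z)            # posterior-mean estimate of x
    b = (lam / n) * np.sum(1.0 - xt**2)  # Onsager coefficient for next step

overlap = abs(xt @ x) / n            # |<x_hat, x>| / n, in [0, 1]
print(f"overlap = {overlap:.3f}")
```

Above the threshold (`lam > 1` in this normalization) the overlap converges to a macroscopic value close to 1; in the hard phase discussed in the abstract, the analogous iteration from an uninformative start remains stuck near zero overlap.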
Publication: arXiv e-prints
Pub Date: May 2018
arXiv: arXiv:1805.05857
Bibcode: 2018arXiv180505857A
Keywords:
 Condensed Matter - Disordered Systems and Neural Networks;
 Condensed Matter - Statistical Mechanics;
 Computer Science - Information Theory;
 Mathematics - Statistics Theory;
 Statistics - Machine Learning
E-Print: 10 pages, 3 figures