Exponential Reduction in Sample Complexity with Learning of Ising Model Dynamics
Abstract
The usual setting for learning the structure and parameters of a graphical model assumes the availability of independent samples drawn from the corresponding multivariate probability distribution. However, for many models the mixing time of the respective Markov chain can be very large, so i.i.d. samples may not be obtainable. We study the problem of reconstructing binary graphical models from correlated samples produced by a dynamical process, which is natural in many applications. We analyze the sample complexity of two estimators, based on the interaction screening objective and the conditional likelihood loss, respectively. We observe that for samples coming from a dynamical process far from equilibrium, the sample complexity is exponentially smaller than for a dynamical process that mixes quickly.
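To make the setting concrete, the sketch below (an illustration assuming single-site Glauber dynamics and a toy 4-node chain Ising model; none of these specifics come from the paper itself) generates correlated samples from a dynamical process and then estimates one node's couplings by minimizing the interaction screening objective, $S(\theta_u) = \frac{1}{n}\sum_t \exp(-\sigma_u^{(t)} \sum_{v \neq u} \theta_{uv}\sigma_v^{(t)})$, via plain gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Ising model on a 4-node chain; couplings chosen for the demo.
p = 4
theta_true = np.zeros((p, p))
for i in range(p - 1):
    theta_true[i, i + 1] = theta_true[i + 1, i] = 0.5

def glauber_samples(theta, n_samples, burn_in=1000):
    """Correlated (non-i.i.d.) samples from single-site Glauber dynamics."""
    p = theta.shape[0]
    sigma = rng.choice([-1, 1], size=p)
    out = np.empty((n_samples, p))
    for t in range(burn_in + n_samples):
        i = rng.integers(p)
        # P(sigma_i = +1 | rest) is the logistic function of twice the local field.
        h = theta[i] @ sigma - theta[i, i] * sigma[i]
        sigma[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2 * h)) else -1
        if t >= burn_in:
            out[t - burn_in] = sigma
    return out

def interaction_screening(samples, u, lr=0.05, iters=1000):
    """Minimize S(theta_u) = mean_t exp(-sigma_u^t <theta_u, sigma_{-u}^t>)."""
    n, p = samples.shape
    s_u = samples[:, u]
    s_rest = np.delete(samples, u, axis=1)
    theta = np.zeros(p - 1)
    for _ in range(iters):
        w = np.exp(-s_u * (s_rest @ theta))              # per-sample weights
        grad = -(s_rest * (s_u * w)[:, None]).mean(axis=0)
        theta -= lr * grad
    return theta

samples = glauber_samples(theta_true, n_samples=30000)
theta_hat = interaction_screening(samples, u=0)  # couplings of node 0 to nodes 1..3
```

The objective is convex in `theta_u`, so gradient descent suffices here; with enough samples the largest recovered coupling of node 0 should be to its true neighbor, node 1. The paper's point is about how many such samples are needed, depending on how far the dynamics are from equilibrium.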
- Publication:
- arXiv e-prints
- Pub Date:
- April 2021
- DOI:
- 10.48550/arXiv.2104.00995
- arXiv:
- arXiv:2104.00995
- Bibcode:
- 2021arXiv210400995D
- Keywords:
- Computer Science - Machine Learning;
- Condensed Matter - Statistical Mechanics;
- Physics - Data Analysis, Statistics and Probability;
- Statistics - Machine Learning
- E-Print:
- Accepted to ICML 2021