On Model Selection Consistency of Lasso for High-Dimensional Ising Models
Abstract
We theoretically analyze the model selection consistency of the least absolute shrinkage and selection operator (Lasso), both with and without post-thresholding, for high-dimensional Ising models. For random regular (RR) graphs of size $p$ with regular node degree $d$ and uniform couplings $\theta_0$, it is rigorously proved that Lasso \textit{without post-thresholding} is model selection consistent in the whole paramagnetic phase with the same order of sample complexity $n=\Omega(d^3\log p)$ as that of $\ell_1$-regularized logistic regression ($\ell_1$-LogR). This result is consistent with the conjecture of Meng, Obuchi, and Kabashima (2021), which was obtained using the non-rigorous replica method from statistical physics, and thus complements it with a rigorous proof. For general tree-like graphs, it is demonstrated that the same result as for RR graphs can be obtained under mild dependency and incoherence conditions. Moreover, we provide a rigorous proof of the model selection consistency of Lasso with post-thresholding for general tree-like graphs in the paramagnetic phase, without further assumptions on the dependency and incoherence conditions. Experimental results agree well with our theoretical analysis.
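As an illustration of the procedure the abstract analyzes, namely per-node Lasso neighborhood regression followed by post-thresholding, the following is a minimal NumPy sketch on a small chain (tree-like) Ising model. The graph size, coupling strength, sample size, regularization weight, and threshold below are illustrative choices for the sketch, not the paper's settings or proofs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chain (tree-like) Ising model: p spins, edges (i, i+1), uniform coupling theta0.
p, theta0 = 6, 0.5
J = np.zeros((p, p))
for i in range(p - 1):
    J[i, i + 1] = J[i + 1, i] = theta0

def gibbs_samples(J, n, burn=200):
    """Gibbs sampler: P(s_i = +1 | rest) = sigmoid(2 * sum_j J_ij s_j)."""
    p = J.shape[0]
    s = rng.choice([-1.0, 1.0], size=p)
    out = np.empty((n, p))
    for t in range(burn + n):
        for i in range(p):
            field = J[i] @ s
            s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * field)) else -1.0
        if t >= burn:
            out[t - burn] = s
    return out

def lasso_cd(X, y, lam, iters=200):
    """Coordinate-descent Lasso: min_w (1/2n)||y - Xw||^2 + lam * ||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n  # equals 1 for +/-1 spin columns
    r = y - X @ w
    for _ in range(iters):
        for j in range(d):
            r += X[:, j] * w[j]                      # remove j's contribution
            rho = X[:, j] @ r / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * w[j]                      # add updated contribution back
    return w

S = gibbs_samples(J, n=5000)

# Neighborhood selection: Lasso-regress each spin on all others,
# then post-threshold the estimated weights at tau.
tau, lam = 0.1, 0.05
W = np.zeros((p, p))
for i in range(p):
    others = [j for j in range(p) if j != i]
    W[i, others] = lasso_cd(S[:, others], S[:, i], lam)
adj = (np.abs(W) > tau) & (np.abs(W.T) > tau)  # AND rule to symmetrize

true_adj = J != 0
print("exact recovery:", np.array_equal(adj, true_adj))
```

At these parameter values the chain is deep in the paramagnetic regime, so the thresholded Lasso supports typically coincide with the true neighborhoods; in the paper's regime of interest the analogous guarantee is the theoretical result, not an empirical observation like this one.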
Publication: arXiv e-prints
Pub Date: October 2021
DOI: 10.48550/arXiv.2110.08500
arXiv: arXiv:2110.08500
Bibcode: 2021arXiv211008500M
Keywords: Statistics - Machine Learning; Computer Science - Machine Learning; Mathematics - Statistics Theory
E-Print: AISTATS 2023, camera-ready version