Convergence Rates for the MAP of an Exponential Family and Stochastic Mirror Descent -- an Open Problem
Abstract
We consider the problem of upper bounding the expected log-likelihood suboptimality of the maximum likelihood estimate (MLE), or a conjugate maximum a posteriori (MAP) for an exponential family, in a non-asymptotic way. Surprisingly, we found no general solution to this problem in the literature. In particular, current theories do not hold for a Gaussian or in the interesting few-samples regime. After exhibiting various facets of the problem, we show that we can interpret the MAP as running stochastic mirror descent (SMD) on the log-likelihood. However, modern convergence results do not apply to standard examples of the exponential family, highlighting gaps in the convergence literature. We believe solving this very fundamental problem may bring progress to both the statistics and optimization communities.
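The MAP-as-SMD interpretation can be illustrated on a simple case. The sketch below (my own illustration, not code from the paper) assumes a Gaussian with unit variance, where the sufficient statistic is T(x) = x: the MLE in the mean parameterization is the running average of sufficient statistics, and that running average is exactly a stochastic mirror descent recursion with step size 1/t.

```python
import numpy as np

rng = np.random.default_rng(0)
# Unit-variance Gaussian: sufficient statistic T(x) = x (illustrative choice)
samples = rng.normal(loc=2.0, scale=1.0, size=100)

# Batch MLE in the mean parameterization: average of sufficient statistics.
mle = samples.mean()

# The same estimate, computed online as an SMD-style recursion with
# step size 1/t:  mu_t = mu_{t-1} + (1/t) * (T(x_t) - mu_{t-1})
mu = 0.0
for t, x in enumerate(samples, start=1):
    mu += (x - mu) / t

print(mu, mle)  # the two estimates coincide
```

A conjugate MAP would correspond to initializing this recursion with the prior's pseudo-observations, which shifts the effective step-size schedule but keeps the same SMD structure.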
Publication: arXiv e-prints
Pub Date: November 2021
DOI: 10.48550/arXiv.2111.06826
arXiv: arXiv:2111.06826
Bibcode: 2021arXiv211106826L
Keywords: Statistics - Machine Learning; Computer Science - Machine Learning; Mathematics - Statistics Theory
E-Print: 9 pages and 3 figures + appendix