Statistical Inference in Mean-Field Variational Bayes
Abstract
We conduct non-asymptotic analysis on the mean-field variational inference for approximating posterior distributions in complex Bayesian models that may involve latent variables. We show that the mean-field approximation to the posterior can be well approximated relative to the Kullback-Leibler divergence discrepancy measure by a normal distribution whose center is the maximum likelihood estimator (MLE). In particular, our results imply that the center of the mean-field approximation matches the MLE up to higher-order terms, and that there is essentially no loss of efficiency in using it as a point estimator for the parameter in any regular parametric model with latent variables. We also propose a new class of variational weighted likelihood bootstrap (VWLB) methods for quantifying the uncertainty in the mean-field variational inference. The proposed VWLB can be viewed as a new sampling scheme that produces independent samples for approximating the posterior. Compared with traditional sampling algorithms such as Markov chain Monte Carlo, VWLB can be implemented in parallel and is free of tuning.
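As a rough illustration of the bootstrap idea behind VWLB, the sketch below implements the classical weighted likelihood bootstrap for a Gaussian location model, where each replicate reweights the log-likelihood with i.i.d. Exp(1) weights and the weighted MLE has a closed form (the weighted average of the data). This is a simplified stand-in: the VWLB of the paper would run mean-field variational inference on each reweighted likelihood rather than compute an exact weighted MLE, and all function names here are illustrative, not from the paper.

```python
import numpy as np

def weighted_likelihood_bootstrap(data, n_draws=1000, seed=0):
    """Sketch of the weighted likelihood bootstrap (closed-form Gaussian case).

    For a Gaussian location model, the weighted MLE of the mean is the
    weighted average of the data, so each approximate posterior draw is
    available in closed form. VWLB (per the abstract) would instead solve
    each reweighted problem with mean-field variational inference.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    draws = np.empty(n_draws)
    for b in range(n_draws):
        # i.i.d. Exp(1) weights on the individual log-likelihood terms
        w = rng.exponential(1.0, size=n)
        # weighted MLE of the mean; draws are independent across b,
        # so this loop is embarrassingly parallel and tuning-free
        draws[b] = np.dot(w, data) / w.sum()
    return draws

data = np.random.default_rng(1).normal(2.0, 1.0, size=200)
samples = weighted_likelihood_bootstrap(data)
```

Each replicate depends only on its own weight vector, which is what makes the scheme parallelizable and, unlike MCMC, free of step-size or proposal tuning.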
 Publication:

arXiv e-prints
 Pub Date:
 November 2019
 arXiv:
 arXiv:1911.01525
 Bibcode:
 2019arXiv191101525H
 Keywords:

 Mathematics - Statistics Theory;
 Statistics - Applications;
 Statistics - Computation;
 Statistics - Methodology;
 Statistics - Machine Learning