Maximal Information Divergence from Statistical Models defined by Neural Networks
Abstract
We review recent results about the maximal values of the Kullback-Leibler information divergence from statistical models defined by neural networks, including naive Bayes models, restricted Boltzmann machines, deep belief networks, and various classes of exponential families. We illustrate approaches to compute the maximal divergence from a given model starting from simple sub- or supermodels. We give a new result for deep and narrow belief networks with finite-valued units.
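To make the central quantity concrete: the divergence from a model M is D(p || M) = inf over q in M of D(p || q), and the paper studies its maximum over all p. The following is a minimal sketch, not taken from the paper, for the simplest case of the independence model of two binary variables, where the infimum is attained at the product of the marginals and the divergence reduces to the multi-information; the function names are illustrative only.

```python
from math import log

def entropy(probs):
    """Shannon entropy (natural log) of a probability vector."""
    return -sum(p * log(p) for p in probs if p > 0)

def divergence_from_independence(p):
    """D(p || independence model) for a 2x2 joint distribution p.

    The closest independent distribution to p (in Kullback-Leibler
    divergence) is the product of its marginals, so the divergence
    equals the multi-information H(X1) + H(X2) - H(X1, X2).
    """
    m1 = [p[0][0] + p[0][1], p[1][0] + p[1][1]]  # marginal of X1
    m2 = [p[0][0] + p[1][0], p[0][1] + p[1][1]]  # marginal of X2
    joint = [p[i][j] for i in range(2) for j in range(2)]
    return entropy(m1) + entropy(m2) - entropy(joint)

# A perfectly correlated distribution attains the known maximum log 2
# for this model (uniform marginals, but joint entropy only log 2):
p_corr = [[0.5, 0.0], [0.0, 0.5]]
print(divergence_from_independence(p_corr))  # -> log(2) ≈ 0.6931
```

Models defined by neural networks (naive Bayes, restricted Boltzmann machines, deep belief networks) generalize this setting, and the reviewed results bound or compute the analogous maxima there.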
 Publication:

arXiv e-prints
 Pub Date:
 March 2013
 DOI:
 10.48550/arXiv.1303.0268
 arXiv:
 arXiv:1303.0268
 Bibcode:
 2013arXiv1303.0268M
 Keywords:

 Mathematics - Statistics Theory;
 Statistics - Machine Learning;
 62E17;
 94A17;
 60E05
 E-Print:
 8 pages, 1 figure