Minimum relative entropy distributions with a large mean are Gaussian
Abstract
Entropy optimization principles are versatile tools with wide-ranging applications from statistical physics to engineering to ecology. Here we consider the following constrained problem: Given a prior probability distribution q, find the posterior distribution p minimizing the relative entropy (also known as the Kullback-Leibler divergence) with respect to q under the constraint that the mean of p is fixed and large. We show that solutions to this problem are approximately Gaussian. We discuss two applications of this result. In the context of dissipative dynamics, the equilibrium distribution of a Brownian particle confined in a strong external field is independent of the shape of the confining potential. We also derive an H-type theorem for evolutionary dynamics: The entropy of the (standardized) distribution of fitness of a population evolving under natural selection is eventually increasing in time.
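The minimization described above has a classical closed form: by the Gibbs variational principle, the distribution minimizing D(p||q) subject to a fixed mean is an exponentially tilted prior, p(x) ∝ q(x) e^{λx}. The following numerical sketch (not from the paper; the quartic prior q(x) ∝ exp(-x⁴) and the tilt strength λ = 200 are illustrative choices) checks the Gaussianity claim by computing the skewness and excess kurtosis of the tilted distribution, both of which should be near zero when the constrained mean is large.

```python
# Sketch: KL minimizer under a mean constraint is the tilted prior
# p(x) ∝ q(x) exp(lam*x); for q(x) ∝ exp(-x**4) and large lam, p
# concentrates near the maximum of lam*x - x**4 and, by a Laplace-type
# expansion, is approximately Gaussian there.
import numpy as np

x = np.linspace(-2.0, 8.0, 200001)   # grid covering the bulk of p
lam = 200.0                          # large tilt -> large mean of p
log_w = lam * x - x**4               # log of unnormalized tilted density
w = np.exp(log_w - log_w.max())      # subtract max for numerical stability
p = w / w.sum()                      # normalize on the grid

mean = np.sum(p * x)
var = np.sum(p * (x - mean) ** 2)
skew = np.sum(p * (x - mean) ** 3) / var ** 1.5
exkurt = np.sum(p * (x - mean) ** 4) / var ** 2 - 3.0

# Near-zero skewness and excess kurtosis indicate near-Gaussian shape.
print(f"mean={mean:.3f}  skew={skew:.4f}  excess kurtosis={exkurt:.4f}")
```

Both shape statistics come out small for this λ, consistent with the paper's claim that the minimizer is approximately Gaussian when the constrained mean is large; shrinking λ makes the non-Gaussian corrections visibly larger.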
- Publication: Physical Review E
- Pub Date: December 2016
- DOI: 10.1103/PhysRevE.94.062107
- arXiv: arXiv:1605.08259
- Bibcode: 2016PhRvE..94f2107S
- Keywords: Condensed Matter - Statistical Mechanics; Mathematics - Probability; Mathematics - Statistics Theory; Quantitative Biology - Populations and Evolution
- E-Print: 5 pages, 3 figures