On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method
Abstract
This paper considers the entropy of a sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds are derived on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean. The derivation of these bounds combines elements of information theory with the Chen-Stein method for Poisson approximation. The resulting bounds are easy to compute, and their applicability is exemplified. This conference paper presents in part the first half of the paper "An information-theoretic perspective of the Poisson approximation via the Chen-Stein method" (arXiv:1206.6811). The full paper generalizes the bounds to assess the accuracy of the Poisson approximation for the entropy of a sum of non-negative, integer-valued and bounded random variables; it also derives lower bounds on the total variation distance, relative entropy, and other measures that are not considered in this conference paper.
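The approximation studied in the paper can be illustrated numerically in the simplest setting of independent Bernoulli summands: the exact entropy of the sum (a Poisson-binomial distribution) is compared against the entropy of a Poisson random variable with the same mean. The probabilities below are illustrative values chosen for this sketch, not taken from the paper, and the comparison shown is a sanity check rather than one of the paper's bounds.

```python
import math

def bernoulli_sum_pmf(ps):
    # PMF of S = X_1 + ... + X_n for independent Bernoulli(p_i),
    # computed by iterative convolution (Poisson-binomial distribution).
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)      # X_i = 0
            new[k + 1] += q * p        # X_i = 1
        pmf = new
    return pmf

def entropy(pmf):
    # Shannon entropy in nats.
    return -sum(q * math.log(q) for q in pmf if q > 0)

def poisson_entropy(lam, tol=1e-15):
    # Entropy of Poisson(lam), summing terms until the tail is negligible.
    h, k, pk = 0.0, 0, math.exp(-lam)
    while pk > tol or k < lam:
        if pk > 0:
            h -= pk * math.log(pk)
        k += 1
        pk *= lam / k
    return h

# Illustrative small success probabilities (a regime where the
# Poisson approximation is expected to be accurate).
ps = [0.05, 0.02, 0.1, 0.07, 0.03]
lam = sum(ps)                              # matched mean
h_sum = entropy(bernoulli_sum_pmf(ps))     # exact entropy of the sum
h_poi = poisson_entropy(lam)               # Poisson entropy, same mean
print(f"H(sum) = {h_sum:.4f} nats, H(Poisson) = {h_poi:.4f} nats, "
      f"gap = {abs(h_sum - h_poi):.4f}")
```

For small success probabilities the two entropies are close, consistent with the regime in which the Chen-Stein method yields tight Poisson approximation bounds.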
- Publication: arXiv e-prints
- Pub Date: July 2012
- DOI: 10.48550/arXiv.1207.0436
- arXiv: arXiv:1207.0436
- Bibcode: 2012arXiv1207.0436S
- Keywords: Computer Science - Information Theory; Mathematics - Probability
- E-Print: A conference paper of 5 pages that appears in the Proceedings of the 2012 IEEE International Workshop on Information Theory (ITW 2012), pp. 542--546, Lausanne, Switzerland, September 2012