On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method
Abstract
This paper considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds are derived on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean. The derivation of these bounds combines elements of information theory with the Chen-Stein method for Poisson approximation. The resulting bounds are easy to compute, and their applicability is exemplified. This conference paper presents in part the first half of the paper entitled "An information-theoretic perspective of the Poisson approximation via the Chen-Stein method" (see arXiv:1206.6811). A generalization of the bounds, which considers the accuracy of the Poisson approximation for the entropy of a sum of non-negative, integer-valued and bounded random variables, is introduced in the full paper. The full paper also derives lower bounds on the total variation distance, relative entropy and other measures that are not considered in this conference paper.
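The approximation studied in the abstract can be illustrated numerically. The sketch below, a minimal illustration and not the paper's method, computes the exact entropy of a sum of independent Bernoulli variables (a Poisson-binomial distribution, obtained by convolution) and compares it with the entropy of a Poisson variable having the same mean. The parameter values `ps` are arbitrary choices for the example.

```python
import math

def poisson_binomial_pmf(ps):
    """PMF of S = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i)."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1.0 - p)   # X_i = 0: count stays at k
            new[k + 1] += q * p       # X_i = 1: count moves to k + 1
        pmf = new
    return pmf

def entropy(pmf):
    """Shannon entropy (in nats) of a probability mass function."""
    return -sum(q * math.log(q) for q in pmf if q > 0.0)

def poisson_entropy(lam, tail=1e-12):
    """Entropy (in nats) of Poisson(lam), truncated once the tail mass is negligible."""
    h, k, pk, cum = 0.0, 0, math.exp(-lam), 0.0
    while cum < 1.0 - tail:
        if pk > 0.0:
            h -= pk * math.log(pk)
        cum += pk
        k += 1
        pk *= lam / k  # recursive Poisson pmf: P(k) = P(k-1) * lam / k
    return h

# Arbitrary example parameters (an assumption for illustration only).
ps = [0.1, 0.2, 0.05, 0.15]
lam = sum(ps)  # matching mean for the Poisson approximation

h_sum = entropy(poisson_binomial_pmf(ps))
h_poisson = poisson_entropy(lam)
print(h_sum, h_poisson, abs(h_sum - h_poisson))
```

For small success probabilities the two entropies are close, which is the regime in which the Poisson approximation (and hence the entropy bounds of the paper) is accurate.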
 Publication:
 arXiv e-prints
 Pub Date:
 July 2012
 arXiv:
 arXiv:1207.0436
 Bibcode:
 2012arXiv1207.0436S
 Keywords:
 Computer Science - Information Theory;
 Mathematics - Probability
 E-Print:
 A conference paper of 5 pages that appears in the Proceedings of the 2012 IEEE International Workshop on Information Theory (ITW 2012), pp. 542-546, Lausanne, Switzerland, September 2012