Maximum Entropy Functions: Approximate Gács-Körner for Distributed Compression
Abstract
Consider two correlated sources $X$ and $Y$ generated from a joint distribution $p_{X,Y}$. Their Gács-Körner common information, a measure of common information that exploits the combinatorial structure of the distribution $p_{X,Y}$, leads to a source decomposition that exposes the latent common parts of $X$ and $Y$. Using this source decomposition, we construct an efficient distributed compression scheme, which can also be used in the network setting. We then relax the combinatorial conditions on the source distribution, which yields an efficient scheme with a helper node that can be thought of as a front-end cache. This relaxation leads to an inherent trade-off between the rate of the helper and the rate reduction at the sources, which we capture by a notion of optimal decomposition. We formulate this as an approximate Gács-Körner optimization. We then discuss properties of this optimization and provide connections with the maximal correlation coefficient, as well as an efficient algorithm, both through the application of spectral graph theory to the bipartite graph induced by $p_{X,Y}$.
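The combinatorial structure the abstract refers to admits a compact numerical illustration: the Gács-Körner common part of $(X, Y)$ is the connected-component label of the bipartite graph whose edges are the support of $p_{X,Y}$, and the maximal correlation coefficient is the second singular value of the normalized joint-distribution matrix $D_X^{-1/2} P D_Y^{-1/2}$. The sketch below is not the paper's algorithm; the function names and example distributions are illustrative.

```python
import math
import numpy as np


def _component_labels(p, tol=1e-12):
    """Union-find over the bipartite graph on the support of p[x, y].

    Nodes 0..nx-1 are X-symbols, nodes nx..nx+ny-1 are Y-symbols; an edge
    joins x and y whenever p[x, y] > tol. Returns one component root per
    X-symbol.
    """
    nx, ny = p.shape
    parent = list(range(nx + ny))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for x in range(nx):
        for y in range(ny):
            if p[x, y] > tol:
                parent[find(x)] = find(nx + y)
    return [find(x) for x in range(nx)]


def gacs_korner(p, tol=1e-12):
    """Gács-Körner common information of p, in bits.

    It equals the entropy of the common part, i.e. of the connected
    component containing X.
    """
    labels = _component_labels(p, tol)
    px = p.sum(axis=1)
    mass = {}
    for x, lab in enumerate(labels):
        mass[lab] = mass.get(lab, 0.0) + px[x]
    return -sum(m * math.log2(m) for m in mass.values() if m > tol)


def maximal_correlation(p):
    """Hirschfeld-Gebelein-Rényi maximal correlation of p.

    Computed spectrally as the second-largest singular value of the
    matrix Q with Q[x, y] = p[x, y] / sqrt(p_X(x) * p_Y(y)); the largest
    singular value is always 1.
    """
    px = p.sum(axis=1)
    py = p.sum(axis=0)
    q = p / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(q, compute_uv=False)  # sorted descending
    return s[1] if len(s) > 1 else 0.0
```

For a distribution whose support splits into two disconnected blocks, the common part is one fair bit and the maximal correlation is 1; for a product distribution both quantities are 0, consistent with the trade-off the abstract describes.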
Publication:
arXiv e-prints
Pub Date:
April 2016
arXiv:
arXiv:1604.03877
Bibcode:
2016arXiv160403877S
Keywords:
Computer Science - Information Theory
E-Print:
Submitted to ITW 2016