Nonnegative Decomposition of Multivariate Information
Abstract
Of the various attempts to generalize information theory to multiple variables, the most widely utilized, interaction information, suffers from the problem that it is sometimes negative. Here we reconsider from first principles the general structure of the information that a set of sources provides about a given variable. We begin with a new definition of redundancy as the minimum information that any source provides about each possible outcome of the variable, averaged over all possible outcomes. We then show how this measure of redundancy induces a lattice over sets of sources that clarifies the general structure of multivariate information. Finally, we use this redundancy lattice to propose a definition of partial information atoms that exhaustively decompose the Shannon information in a multivariate system in terms of the redundancy between synergies of subsets of the sources. Unlike interaction information, the atoms of our partial information decomposition are never negative and always support a clear interpretation as informational quantities. Our analysis also demonstrates how the negativity of interaction information can be explained by its confounding of redundancy and synergy.
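The redundancy measure described above (the minimum information any single source provides about each outcome of the target, averaged over outcomes) can be made concrete with a short sketch. The snippet below is an illustrative implementation under the assumption that "information about an outcome" means the specific information I(S=s; A) = Σ_a p(a|s) [log 1/p(s) − log 1/p(s|a)]; the AND-gate distribution and the helper names (`marginal`, `specific_information`, `i_min`) are illustrative choices, not taken from the paper.

```python
import itertools
from math import log2

# Toy joint distribution p(s, a1, a2): the target S is the AND of two
# uniform binary sources A1 and A2. (Illustrative choice, not from the paper.)
joint = {}
for a1, a2 in itertools.product([0, 1], repeat=2):
    joint[(a1 & a2, a1, a2)] = 0.25


def marginal(joint, keep):
    """Marginalize the joint distribution onto the index positions in `keep`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out


def specific_information(joint, s, source):
    """I(S=s; A) = sum_a p(a|s) [log 1/p(s) - log 1/p(s|a)]:
    the information that source A provides about the particular outcome S = s."""
    p_s = marginal(joint, [0])[(s,)]
    p_a = marginal(joint, list(source))
    p_sa = marginal(joint, [0] + list(source))
    info = 0.0
    for key, p in p_sa.items():
        if key[0] != s or p == 0.0:
            continue
        a = key[1:]
        p_a_given_s = p / p_s
        p_s_given_a = p / p_a[a]
        info += p_a_given_s * (log2(1.0 / p_s) - log2(1.0 / p_s_given_a))
    return info


def i_min(joint, sources):
    """Redundancy: for each outcome s, take the minimum specific information
    over the sources, then average the minima weighted by p(s)."""
    return sum(
        p * min(specific_information(joint, s, src) for src in sources)
        for (s,), p in marginal(joint, [0]).items()
    )


# Redundancy between the individual sources A1 (index 1) and A2 (index 2).
print(i_min(joint, sources=[(1,), (2,)]))  # ~0.311 bits for this AND gate
```

Because the minimum of specific informations is taken pointwise per outcome and each specific information is averaged with nonnegative weights, the resulting redundancy is nonnegative by construction, which is the property the decomposition in the abstract relies on.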
- Publication:
- arXiv e-prints
- Pub Date:
- April 2010
- DOI:
- 10.48550/arXiv.1004.2515
- arXiv:
- arXiv:1004.2515
- Bibcode:
- 2010arXiv1004.2515W
- Keywords:
- Computer Science - Information Theory;
- Mathematical Physics;
- Physics - Biological Physics;
- Physics - Data Analysis, Statistics and Probability;
- Quantitative Biology - Neurons and Cognition;
- Quantitative Biology - Quantitative Methods;
- 94A15
- E-Print:
- 14 pages, 9 figures