Characterizing Multivariate Information Flows
Abstract
One of the crucial steps in scientific studies is to specify the dependent relationships among factors in a system of interest. Given little knowledge of a system, can we characterize the underlying dependent relationships by observing its temporal behavior? In multivariate systems, many possible dependent structures can be confused with one another, which may lead to false detection of illusory dependencies between unrelated factors. The present study proposes a new information-theoretic measure that accounts for such potential multivariate relationships. The proposed measure, called multivariate transfer entropy, is an extension of transfer entropy, a measure of temporal predictability. In simulations and empirical studies, we demonstrate that the proposed measure characterizes the latent dependent relationships in unknown dynamical systems more accurately than an alternative measure.
- Publication:
- arXiv e-prints
- Pub Date:
- December 2012
- DOI:
- 10.48550/arXiv.1212.5449
- arXiv:
- arXiv:1212.5449
- Bibcode:
- 2012arXiv1212.5449H
- Keywords:
- Computer Science - Information Theory;
- Mathematics - Dynamical Systems;
- Statistics - Methodology
- E-Print:
- This manuscript is submitted to Proceedings of the National Academy of Sciences of the United States of America