Causation entropy identifies indirect influences, dominance of neighbors and anticipatory couplings
Abstract
Inference of causality is central to nonlinear time series analysis and to science in general. A popular approach to inferring causality between two processes is to measure the information flow between them in terms of transfer entropy. Using the dynamics of coupled oscillator networks, we show that although transfer entropy can successfully detect information flow between two processes, it often leads to erroneous identification of network connections in the presence of indirect interactions, dominance of neighbors, or anticipatory couplings. These effects are especially pronounced for time-dependent networks. To overcome these limitations, we develop a measure called causation entropy and show that its application leads to reliable identification of the true couplings.
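The core failure mode the abstract describes can be illustrated with a toy example. The sketch below is not the authors' implementation: it uses a hypothetical binary chain X → Y → Z with a persistent X process, plug-in (histogram) estimates of conditional mutual information, and lag-1 couplings. Transfer entropy, TE(X → Z) = I(X_t ; Z_{t+1} | Z_t), picks up the indirect X → Z influence, while a causation-entropy-style quantity that additionally conditions on the intermediate node, I(X_t ; Z_{t+1} | Z_t, Y_t), correctly vanishes (up to estimator bias).

```python
import numpy as np
from collections import Counter

def cmi(x, y, z):
    """Plug-in estimate of I(X;Y|Z) in bits for discrete samples.

    x, y are sequences of hashable symbols; z is a sequence of hashable
    conditioning symbols (use tuples for multivariate conditioning).
    """
    n = len(x)
    cxyz = Counter(zip(x, y, z))
    cxz = Counter(zip(x, z))
    cyz = Counter(zip(y, z))
    cz = Counter(z)
    return sum(
        (nabc / n) * np.log2(nabc * cz[c] / (cxz[(a, c)] * cyz[(b, c)]))
        for (a, b, c), nabc in cxyz.items()
    )

# Toy chain X -> Y -> Z (no direct X -> Z coupling).
rng = np.random.default_rng(0)
N = 200_000
flip_x = rng.random(N) < 0.2          # X is a persistent Markov chain
e_y = (rng.random(N) < 0.1).astype(int)  # channel noise X -> Y
e_z = (rng.random(N) < 0.1).astype(int)  # channel noise Y -> Z

x = np.zeros(N, dtype=int)
for t in range(N - 1):
    x[t + 1] = 1 - x[t] if flip_x[t] else x[t]
y = np.zeros(N, dtype=int)
z = np.zeros(N, dtype=int)
y[1:] = x[:-1] ^ e_y[1:]              # Y_{t+1} driven by X_t
z[1:] = y[:-1] ^ e_z[1:]              # Z_{t+1} driven by Y_t only

# Transfer entropy: condition only on the target's own past.
te_yz = cmi(y[:-1], z[1:], z[:-1])    # true link, large
te_xz = cmi(x[:-1], z[1:], z[:-1])    # indirect link, spuriously positive

# Causation-entropy-style estimate: also condition on the neighbor Y.
cond = list(zip(z[:-1], y[:-1]))
cse_xz = cmi(x[:-1], z[1:], cond)     # near zero: X -> Z is indirect

print(f"TE(Y->Z)  = {te_yz:.4f} bits")
print(f"TE(X->Z)  = {te_xz:.4f} bits (spurious)")
print(f"CSE(X->Z) = {cse_xz:.4f} bits (conditioned on Y)")
```

With these assumed parameters, TE(X → Z) comes out clearly positive even though Z never reads X directly; conditioning on Y drives the estimate down to estimator bias, which is the qualitative point the abstract makes about indirect interactions.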
- Publication: Physica D Nonlinear Phenomena
- Pub Date: January 2014
- DOI: 10.1016/j.physd.2013.07.001
- arXiv: arXiv:1504.03769
- Bibcode: 2014PhyD..267...49S
- Keywords: Nonlinear Sciences - Chaotic Dynamics; Condensed Matter - Statistical Mechanics; Mathematics - Dynamical Systems
- E-Print: Physica D 267, 49--57 (2014)