Mutual information in random Boolean models of regulatory networks
Abstract
The amount of mutual information contained in the time series of two elements gives a measure of how well their activities are coordinated. In a large, complex network of interacting elements, such as a genetic regulatory network within a cell, the average of the mutual information over all pairs, ⟨I⟩, is a global measure of how well the system can coordinate its internal dynamics. We study this average pairwise mutual information in random Boolean networks (RBNs) as a function of the distribution of Boolean rules implemented at each element, assuming that the links in the network are randomly placed. Efficient numerical methods for calculating ⟨I⟩ show that, as the number of network nodes, N, approaches infinity, the quantity N⟨I⟩ exhibits a discontinuity at parameter values corresponding to critical RBNs. For finite systems, N⟨I⟩ peaks near the critical value, though slightly into the disordered regime, for typical parameter variations. High values of N⟨I⟩ arise from indirect correlations between pairs of elements on different long chains with a common starting point. The contribution from directly linked pairs approaches zero for critical networks and peaks deep in the disordered regime.
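For concreteness, the sketch below shows a brute-force estimate of ⟨I⟩ from a simulated RBN trajectory: build a randomly wired network, iterate it, and compute the equal-time pairwise mutual information from the sampled joint frequencies. This is not the paper's efficient method; the function names, the choice of log base 2, the transient and sample lengths, and the example parameters (K = 2 inputs per node with rule bias 0.5, a standard critical point) are illustrative assumptions.

```python
# Minimal sketch, assuming a synchronously updated RBN with fixed in-degree K
# and independent random truth tables; not the authors' algorithm.
import numpy as np

def random_boolean_network(N, K, bias=0.5, seed=None):
    """Random wiring: each node gets K distinct random inputs and a random
    truth table whose outputs are 1 with probability `bias`."""
    rng = np.random.default_rng(seed)
    inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
    rules = (rng.random((N, 2 ** K)) < bias).astype(np.uint8)
    return inputs, rules

def step(state, inputs, rules):
    """Synchronous update: each node reads its K input bits as a binary
    index into its own truth table."""
    idx = np.zeros(len(state), dtype=np.int64)
    for k in range(inputs.shape[1]):
        idx = (idx << 1) | state[inputs[:, k]]
    return rules[np.arange(len(state)), idx]

def average_pairwise_mi(ts):
    """<I>: average over node pairs of
    I_ij = sum_{a,b} P(s_i=a, s_j=b) log2[P(s_i=a, s_j=b) / (P(s_i=a) P(s_j=b))],
    with probabilities estimated as frequencies in the binary time series
    `ts` of shape (T, N)."""
    T, N = ts.shape
    p1 = ts.mean(axis=0)                        # marginal P(s_i = 1)
    total = 0.0
    for i in range(N):
        for j in range(i + 1, N):
            p11 = np.mean(ts[:, i] & ts[:, j])  # joint frequencies
            p10 = p1[i] - p11
            p01 = p1[j] - p11
            p00 = 1.0 - p11 - p10 - p01
            for pab, pa, pb in ((p00, 1 - p1[i], 1 - p1[j]),
                                (p01, 1 - p1[i], p1[j]),
                                (p10, p1[i], 1 - p1[j]),
                                (p11, p1[i], p1[j])):
                if pab > 0:
                    total += pab * np.log2(pab / (pa * pb))
    return 2.0 * total / (N * (N - 1))

if __name__ == "__main__":
    N, K = 200, 2                               # K = 2, bias 0.5: critical RBN
    inputs, rules = random_boolean_network(N, K, bias=0.5, seed=1)
    state = np.random.default_rng(2).integers(0, 2, N, dtype=np.uint8)
    for _ in range(1000):                       # discard a transient
        state = step(state, inputs, rules)
    ts = np.empty((2000, N), dtype=np.uint8)
    for t in range(ts.shape[0]):                # record a sample trajectory
        state = step(state, inputs, rules)
        ts[t] = state
    print("N<I> estimate:", N * average_pairwise_mi(ts))
```

Frozen or uncorrelated pairs contribute essentially zero to the sum, so a direct estimate like this is noisy and requires long runs and averaging over many network realizations; the quantity with the sharp feature at criticality is N⟨I⟩, not ⟨I⟩ itself.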
- Publication:
- Physical Review E
- Pub Date:
- January 2008
- DOI:
- 10.1103/PhysRevE.77.011901
- arXiv:
- arXiv:0707.3642
- Bibcode:
- 2008PhRvE..77a1901R
- Keywords:
- 87.10.-e;
- 89.75.Fb;
- 02.50.Ng;
- General theory and mathematical aspects;
- Structures and organization in complex systems;
- Distribution theory and Monte Carlo studies;
- Quantitative Biology - Other;
- Quantitative Biology - Quantitative Methods
- E-Print:
- 11 pages, 6 figures