How to Read Probability Distributions as Statements about Process
Abstract
Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken over the measurement scale that relates changes in observed values to changes in information, and the transformation from the underlying scale on which information dissipates to alternative scales on which probability pattern may be expressed. Information invariances set the commonly observed measurement scales and the relations between them. In particular, a measurement scale for information is defined by its invariance to specific transformations of underlying values into measurable outputs. Essentially all common distributions can be understood within this simple framework of information invariance and measurement scale.
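The first two components of the parsing above (dissipation of all information, except preservation of average values) correspond to maximum entropy subject to a mean constraint. The sketch below, which is an illustration and not code from the paper, builds the discrete maximum-entropy distribution with a fixed mean on a finite support (the familiar Boltzmann form `exp(-lam * i)`) and checks that any other distribution with the same mean has lower entropy. All function names are my own.

```python
import math

def boltzmann(lam, n):
    """Discrete maximum-entropy form exp(-lam * i), normalized on {0, ..., n-1}."""
    w = [math.exp(-lam * i) for i in range(n)]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p):
    return sum(i * pi for i, pi in enumerate(p))

def entropy(p):
    # Shannon entropy in nats; zero-probability terms contribute nothing.
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def maxent_with_mean(mu, n):
    """Bisect on lam so the mean constraint is met.

    The mean decreases monotonically in lam; this assumes
    0 < mu < (n - 1) / 2, so the solution has lam > 0.
    """
    lo, hi = 0.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(boltzmann(mid, n)) > mu:
            lo = mid  # mean too high -> need larger lam
        else:
            hi = mid
    return boltzmann(0.5 * (lo + hi), n)

n, mu = 20, 4.0
p = maxent_with_mean(mu, n)

# Any other distribution with the same mean carries extra information,
# so its entropy must be lower; e.g. a two-point distribution on {0, 8}.
q = [0.0] * n
q[0], q[8] = 0.5, 0.5

print(round(mean(p), 6), round(entropy(p), 4), round(entropy(q), 4))
```

The remaining two components of the parsing (the measurement scale and the change of scale) would enter here as a transformation of the support `i` before applying the same entropy maximization, which is how the paper relates the exponential form to other common distributions.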
- Publication: Entropy
- Pub Date: November 2014
- DOI: 10.3390/e16116059
- arXiv: arXiv:1409.5196
- Bibcode: 2014Entrp..16.6059F
- Keywords: measurement; maximum entropy; information theory; statistical mechanics; extreme value distributions; neutral theories in biology; Statistics - Other Statistics; Condensed Matter - Statistical Mechanics; Mathematics - Probability; Physics - Data Analysis; Statistics and Probability; Quantitative Biology - Quantitative Methods
- E-Print: v2: added table of contents, adjusted section numbers; v3: minor editing, updated reference