Reduced Perplexity: A simplified perspective on assessing probabilistic forecasts
Abstract
A simple, intuitive approach to the assessment of probabilistic inferences is introduced. The Shannon information metrics are translated to the probability domain. The translation shows that the negative logarithmic score and the geometric mean are equivalent measures of the accuracy of a probabilistic inference. Thus good inference algorithms achieve both a quantitative reduction in perplexity, which is the inverse of the geometric mean of the probabilities, as they reduce uncertainty, and a qualitative reduction due to the increased clarity gained by condensing the original set of probabilistic forecasts into their central tendency, the geometric mean. Further insight is provided by showing that the Rényi and Tsallis entropy functions, translated to the probability domain, are both the weighted generalized mean of the distribution. The generalized mean of probabilistic forecasts forms a spectrum of performance metrics referred to as a Risk Profile. The arithmetic mean is used to measure the decisiveness, while the -2/3 mean is used to measure the robustness.
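The relationships stated in the abstract can be sketched numerically. The snippet below, using made-up forecast probabilities, checks that the geometric mean of the probabilities equals the exponential of the negative of the mean logarithmic score (so perplexity is its inverse), and evaluates the weighted generalized mean at the orders the abstract singles out (1 for decisiveness, -2/3 for robustness). The variable names and example data are illustrative, not from the paper.

```python
import math

# Hypothetical probabilities a forecaster assigned to the outcomes that occurred.
probs = [0.9, 0.6, 0.8, 0.3, 0.7]
n = len(probs)

# Mean negative logarithmic score of the forecasts.
neg_log_score = -sum(math.log(p) for p in probs) / n

# Geometric mean of the probabilities; perplexity is its inverse.
geo_mean = math.exp(sum(math.log(p) for p in probs) / n)
perplexity = 1.0 / geo_mean

# Equivalence: exp(-negative log score) recovers the geometric mean.
assert abs(math.exp(-neg_log_score) - geo_mean) < 1e-12

def generalized_mean(ps, r):
    """Generalized (power) mean of order r; the limit r -> 0 is the geometric mean."""
    if r == 0:
        return math.exp(sum(math.log(p) for p in ps) / len(ps))
    return (sum(p ** r for p in ps) / len(ps)) ** (1.0 / r)

decisiveness = generalized_mean(probs, 1.0)       # arithmetic mean
robustness = generalized_mean(probs, -2.0 / 3.0)  # -2/3 mean, sensitive to low probabilities
```

Because the power mean is non-decreasing in its order, the robustness metric penalizes a forecaster's worst (lowest-probability) calls more heavily than the geometric mean does, while the decisiveness metric is more forgiving of them.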
- Publication: arXiv e-prints
- Pub Date: March 2016
- DOI: 10.48550/arXiv.1603.08830
- arXiv: arXiv:1603.08830
- Bibcode: 2016arXiv160308830N
- Keywords: Statistics - Other Statistics
- E-Print: 21 pages, 5 figures, conference paper presented at Recent Advances in Info-Metrics, Washington, DC, 2014. Accepted for a book chapter in "Recent innovations in info-metrics: a cross-disciplinary perspective on information and information processing" by Oxford University Press