An information upper bound for probability sensitivity
Abstract
Uncertain inputs of a mathematical model induce uncertainties in the output, and probabilistic sensitivity analysis identifies the influential inputs to guide decision-making. Of practical concern is the probability that the output would, or would not, exceed a threshold; the probability sensitivity depends on this threshold, which is often itself uncertain. The Fisher information and the Kullback-Leibler divergence have recently been proposed in the literature as threshold-independent sensitivity metrics. We present a mathematical proof that these information-theoretic metrics provide an upper bound for the probability sensitivity. The proof is elementary, relying only on a special version of the Cauchy-Schwarz inequality known as Titu's lemma. Although various inequalities exist for probabilities, little is known about bounds on probability sensitivity, and the bound proposed here is, to the best of the present authors' knowledge, new. The probability sensitivity bound is extended, analytically and with numerical examples, to the Fisher information of both the input and the output. It thus provides a solid mathematical basis for decision-making based on probabilistic sensitivity metrics.
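For reference, the two standard facts the abstract leans on can be stated briefly; the sketch below gives the textbook form of Titu's lemma and the usual second-order link between the Kullback-Leibler divergence and the Fisher information. How the paper applies these to the probability sensitivity is not reproduced here, so this is background under standard assumptions rather than the paper's own derivation.

```latex
% Titu's lemma (Engel form of the Cauchy-Schwarz inequality):
% for real a_1, ..., a_n and positive b_1, ..., b_n,
\[
  \sum_{i=1}^{n} \frac{a_i^{2}}{b_i}
  \;\ge\;
  \frac{\left( \sum_{i=1}^{n} a_i \right)^{2}}{\sum_{i=1}^{n} b_i},
\]
% with equality if and only if a_i / b_i is constant in i.

% Standard local relation between KL divergence and Fisher information:
% for a smooth parametric family p_\theta and a small perturbation \delta,
\[
  D_{\mathrm{KL}}\!\left( p_{\theta} \,\middle\|\, p_{\theta+\delta} \right)
  = \tfrac{1}{2}\, I(\theta)\, \delta^{2} + O(\delta^{3}),
\]
% where I(\theta) is the Fisher information of the family at \theta.
```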
- Publication: arXiv e-prints
- Pub Date: June 2022
- DOI: 10.48550/arXiv.2206.02274
- arXiv: arXiv:2206.02274
- Bibcode: 2022arXiv220602274Y
- Keywords:
  - Computer Science - Information Theory
  - Mathematics - Numerical Analysis
  - Mathematics - Statistics Theory
  - Statistics - Applications
- E-Print: 18 pages, 5 figures; for the datasets generated and/or analysed during the current study, see http://doi.org/10.5281/zenodo.6615192