Estimating information amount under uncertainty: algorithmic solvability and computational complexity
Abstract
Measurement results (and, more generally, estimates) are never absolutely accurate: there is always an uncertainty, so the actual value x is, in general, different from the estimate x̃. Sometimes, we know the probabilities of different values of the estimation error Δx = x̃ − x; sometimes, we only know the interval of possible values of Δx; sometimes, we have interval bounds on the cumulative distribution function of Δx. To compare different measuring instruments, it is desirable to know which of them brings more information, i.e. it is desirable to gauge the amount of information. For probabilistic uncertainty, this amount of information is described by Shannon's entropy; similar measures can be developed for interval and other types of uncertainty. In this paper, we analyse the computational complexity of the problem of estimating information amount under different types of uncertainty.
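The two measures mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm, just the standard textbook quantities: Shannon's entropy for a discrete probability distribution, and, for interval uncertainty, the number of bits log2(width/resolution) needed to localize a value known only to lie in an interval (the function names and the choice of a discretized distribution are illustrative assumptions):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution,
    H = -sum p * log2(p), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def interval_information(width, resolution):
    """Bits needed to localize a value known only to lie in an
    interval of the given width, down to the given resolution:
    log2(width / resolution)."""
    return math.log2(width / resolution)

# A uniform distribution over 4 outcomes carries 2 bits.
print(shannon_entropy([0.25] * 4))        # 2.0
# Narrowing [0, 1] to sub-intervals of width 0.125 takes 3 bits.
print(interval_information(1.0, 0.125))   # 3.0
```

Both quantities agree in the limiting case: a uniform distribution over 2^k equally likely cells of an interval has entropy k, matching the interval measure for the corresponding resolution.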
 Publication:

International Journal of General Systems
 Pub Date:
 May 2010
 DOI:
 10.1080/03081071003696025
 Bibcode:
 2010IJGS...39..349K
 Keywords:

 amount of information;
 uncertainty;
 probabilistic uncertainty;
 interval uncertainty;
 entropy;
 computational complexity