Estimating information amount under uncertainty: algorithmic solvability and computational complexity
Measurement results (and, more generally, estimates) are never absolutely accurate: there is always an uncertainty, so the actual value x is, in general, different from the estimate x̃. Sometimes, we know the probabilities of different values of the estimation error Δx = x̃ − x; sometimes, we only know the interval of possible values of Δx; and sometimes, we have interval bounds on the cumulative distribution function of Δx. To compare different measuring instruments, it is desirable to know which of them brings more information, i.e., it is desirable to gauge the amount of information. For probabilistic uncertainty, this amount of information is described by Shannon's entropy; similar measures can be developed for interval and other types of uncertainty. In this paper, we analyse the computational complexity of the problem of estimating the amount of information under different types of uncertainty.
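The two notions of "amount of information" mentioned above can be illustrated with a minimal sketch. The `shannon_entropy` function implements the standard formula H = −Σ pᵢ log₂ pᵢ for a discrete probability distribution; the `interval_information` function is a hypothetical interval analogue (the number of bits needed to distinguish values on an interval of a given width up to a given accuracy), shown here only for illustration, not as the paper's actual definition.

```python
import math


def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a discrete
    distribution, in bits; zero-probability outcomes are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


def interval_information(width, accuracy):
    """Illustrative interval analogue (an assumption, not the paper's
    definition): log2 of the number of values on an interval of the
    given width that are distinguishable at the given accuracy."""
    return math.log2(width / accuracy)
```

For example, a uniform distribution over four outcomes carries `shannon_entropy([0.25] * 4)` = 2 bits, and an interval of width 8 measured with accuracy 1 distinguishes 8 values, i.e. `interval_information(8.0, 1.0)` = 3 bits.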