A Pessimistic Approximation for the Fisher Information Measure
Abstract
The problem of determining the intrinsic quality of a signal processing system with respect to the inference of an unknown deterministic parameter $\theta$ is considered. While the Fisher information measure $F(\theta)$ forms a classical tool for such a problem, direct computation of the information measure can become difficult in various situations. For the estimation theoretic performance analysis of nonlinear measurement systems, the form of the likelihood function can make the calculation of the information measure $F(\theta)$ challenging. In situations where no closed-form expression of the statistical system model is available, the analytical derivation of $F(\theta)$ is not possible at all. Based on the Cauchy-Schwarz inequality, we derive an alternative information measure $S(\theta)$. It provides a lower bound on the Fisher information $F(\theta)$ and has the property of being evaluated with the mean, the variance, the skewness and the kurtosis of the system model at hand. These entities usually exhibit good mathematical tractability or can be determined at low complexity by real-world measurements in a calibrated setup. With various examples, we show that $S(\theta)$ provides a good conservative approximation for $F(\theta)$ and outline different estimation theoretic problems where the presented information bound turns out to be useful.
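A bound of the kind described in the abstract can be sketched numerically. The snippet below is a hypothetical illustration, not the paper's exact expression: it assumes the Cauchy-Schwarz bound obtained by projecting the score function onto the span of $\{z, z^2\}$, which yields $S(\theta) = \boldsymbol{v}^{\mathrm{T}} \boldsymbol{C}^{-1} \boldsymbol{v}$ with $\boldsymbol{v}$ the derivative of the first two raw moments and $\boldsymbol{C}$ the covariance matrix of $(z, z^2)$, i.e., a function of the moments up to fourth order (equivalently mean, variance, skewness, and kurtosis). The function names `moment_bound` and `gaussian_moments` are illustrative, not from the paper.

```python
import numpy as np

def moment_bound(mu_fn, theta, eps=1e-5):
    """Moment-based lower bound S(theta) on the Fisher information F(theta).

    mu_fn(theta) must return the first four raw moments (mu1, mu2, mu3, mu4)
    of the system output z for parameter value theta. Sketch of the
    score-projection bound onto span{z, z^2} (an assumption here).
    """
    mu1, mu2, mu3, mu4 = mu_fn(theta)
    # Covariance matrix of (z, z^2), expressed through raw moments.
    C = np.array([[mu2 - mu1**2,   mu3 - mu1 * mu2],
                  [mu3 - mu1 * mu2, mu4 - mu2**2]])
    # Derivatives of the first two raw moments via central differences.
    m_plus = np.array(mu_fn(theta + eps)[:2])
    m_minus = np.array(mu_fn(theta - eps)[:2])
    v = (m_plus - m_minus) / (2.0 * eps)
    return float(v @ np.linalg.solve(C, v))

def gaussian_moments(theta, sigma=1.0):
    """Raw moments of z ~ N(theta, sigma^2)."""
    m1 = theta
    m2 = theta**2 + sigma**2
    m3 = theta**3 + 3.0 * theta * sigma**2
    m4 = theta**4 + 6.0 * theta**2 * sigma**2 + 3.0 * sigma**4
    return m1, m2, m3, m4

# For a Gaussian mean parameter the score is linear in z, so this
# projection bound is tight: S(theta) = F(theta) = 1/sigma^2.
print(moment_bound(gaussian_moments, theta=0.7))  # close to 1.0
```

For system models without a tractable likelihood, the four moments fed to `moment_bound` could instead be estimated from calibrated measurements, which is the low-complexity use case the abstract points to.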
Publication:
IEEE Transactions on Signal Processing
Pub Date:
January 2017
DOI:
10.1109/TSP.2016.2617824
arXiv:
arXiv:1508.03878
Bibcode:
2017ITSP...65..386S
Keywords:
Computer Science - Information Theory;
Electrical Engineering and Systems Science - Signal Processing
E-Print:
IEEE Transactions on Signal Processing, vol. 65, no. 2, pp. 386-396, 2017