F-Information, a Unitless Variant of Fisher Information
Abstract
A new information matrix [F] with elements Fmn = < (ym - am)(yn - an) (∂ ln p(y | a)/∂am) (∂ ln p(y | a)/∂an) > is analyzed. The PDF p(y | a) is the usual likelihood law. [F] differs from the Fisher information matrix by the presence of the first two factors in the given expectation. These factors make Fmn unitless, in contrast with Fisher information. This lack of units allows Fmn values from entirely different phenomena to be compared, as Shannon information values can be. Each element Fmn defines an error inequality analogous to the Cramér-Rao inequality. In the scalar case, where Fmn ≡ F, one finds F = 3 for a normal p(y | a) law and F = 9 for an exponential law. A variational principle F = min (called FMIN) allows an unknown PDF p(x) to be estimated in the presence of weak information. Under certain conditions F obeys a "Boltzmann F-theorem" ∂F/∂t ⩽ 0, indicating that F is a physical entropy. Finally, the trace ℱ of [F] may be used as the scalar information quantity in an information-based principle for deriving distribution laws p of physics.
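The two scalar values quoted in the abstract can be checked numerically. The sketch below is a Monte Carlo estimate of F = < (y - a)² (∂ ln p(y | a)/∂a)² > under two assumed likelihood families (a normal law with mean a and an exponential law with mean a); the sample size, seed, and parameter values are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # Monte Carlo sample size (illustrative)
a, sigma = 2.0, 1.5  # illustrative parameter values

# Normal likelihood p(y|a) = N(a, sigma^2): score is d ln p/da = (y - a)/sigma^2,
# so F = <(y - a)^4>/sigma^4 = 3 (fourth central moment of a normal is 3 sigma^4).
y = rng.normal(a, sigma, N)
score = (y - a) / sigma**2
F_normal = np.mean((y - a) ** 2 * score**2)  # expect approximately 3

# Exponential likelihood p(y|a) = (1/a) exp(-y/a): score is (y - a)/a^2,
# so F = <(y - a)^4>/a^4 = 9 (fourth central moment of an exponential is 9 a^4).
y = rng.exponential(a, N)
score = (y - a) / a**2
F_exp = np.mean((y - a) ** 2 * score**2)  # expect approximately 9

print(F_normal, F_exp)
```

Note that both estimates are independent of the chosen a (and sigma), which is the unitless character the abstract emphasizes: rescaling y and a cancels between the deviation factors and the score factors.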
- Publication: Foundations of Physics
- Pub Date: October 1999
- DOI:
- Bibcode: 1999FoPh...29.1521F
- Keywords:
  - Entropy
  - Variational Principle
  - Fisher Information
  - Information Matrix
  - Scalar Case