Enumerable Distributions, Randomness, Dependence
Abstract
Mutual information I in infinite sequences (and in their finite prefixes) is essential in the theoretical analysis of many situations, yet its proper definition has long been elusive. I address it by generalizing Kolmogorov complexity theory from measures to SEMImeasures, i.e., infimums of sets of measures. Being concave rather than linear functionals, semimeasures are quite delicate to handle; yet they adequately capture various theoretical and practical scenarios. A simple lower bound $i(\alpha{:}\beta) = \sup_{x\in\mathbb{N}} \bigl(K(x) - K(x\mid\alpha) - K(x\mid\beta)\bigr)$ for information turns out to be tight for Martin-Löf random $\alpha,\beta$. For all sequences, $I(\alpha{:}\beta)$ is characterized by the minimum of $i(\alpha'{:}\beta')$ over random $\alpha',\beta'$ with $U(\alpha')=\alpha$, $U(\beta')=\beta$.
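The abstract's two central quantities can be written out side by side. This is a sketch under standard notation not spelled out in the abstract itself: $K$ is assumed to be prefix Kolmogorov complexity, $K(x\mid\alpha)$ its form conditioned on (an oracle for) the sequence $\alpha$, and $U$ the optimal universal machine.

```latex
% Sketch of the abstract's formulas; K, K(.|.), U as assumed above.
\begin{align}
  % Lower bound on mutual information between infinite sequences:
  i(\alpha{:}\beta) &= \sup_{x\in\mathbb{N}}
      \bigl(K(x) - K(x\mid\alpha) - K(x\mid\beta)\bigr),\\
  % Tight when \alpha,\beta are Martin-L\"of random; in general,
  % I is the minimum of i over random preimages under U:
  I(\alpha{:}\beta) &= \min\bigl\{\, i(\alpha'{:}\beta') \;:\;
      \alpha',\beta' \text{ random},\ U(\alpha')=\alpha,\ U(\beta')=\beta \,\bigr\}.
\end{align}
```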
Publication: arXiv e-prints
Pub Date: August 2012
arXiv: arXiv:1208.2955
Bibcode: 2012arXiv1208.2955L
Keywords: Computer Science - Computational Complexity
E-Print: 9 pages