Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences
Abstract
Solomonoff's uncomputable universal prediction scheme $\xi$ allows one to predict the next symbol $x_k$ of a sequence $x_1 \ldots x_{k-1}$ for any Turing-computable, but otherwise unknown, probabilistic environment $\mu$. This scheme will be generalized to arbitrary environmental classes, which, among other things, allows the construction of computable universal prediction schemes $\xi$. Convergence of $\xi$ to $\mu$ in a conditional mean squared sense and with $\mu$-probability 1 is proven. It is shown that the average number of prediction errors made by the universal $\xi$ scheme rapidly converges to that made by the best possible informed $\mu$ scheme. The schemes, theorems and proofs are given for a general finite alphabet, which results in additional complications as compared to the binary case. Several extensions of the presented theory and results are outlined. They include general loss functions and bounds, games of chance, infinite alphabet, partial and delayed prediction, classification, and more active systems.
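The generalized scheme can be illustrated by a Bayes mixture over a finite class of environments: $\xi$ predicts with the prior-weighted average of the class members' predictions, and the weights concentrate on environments consistent with the observed history. The following is a minimal illustrative sketch under simplifying assumptions (a two-element class of i.i.d. environments over a ternary alphabet); the function name and data layout are hypothetical and not from the paper.

```python
# Minimal sketch (assumption: i.i.d. environments, finite class) of a
# Bayes-mixture predictor xi over a finite alphabet.
def mixture_predict(envs, weights, history):
    """Return xi(a | history) for each symbol a.

    envs    : list of dicts mapping symbol -> probability (each a candidate mu)
    weights : prior weights w_nu over the environments
    history : observed sequence x_1 ... x_{k-1}
    """
    # Posterior weight of each environment nu: proportional to w_nu * nu(history).
    post = []
    for nu, w in zip(envs, weights):
        p = w
        for x in history:
            p *= nu[x]
        post.append(p)
    z = sum(post)
    post = [p / z for p in post]
    # xi(a | history) = sum_nu  post_nu * nu(a)
    alphabet = envs[0].keys()
    return {a: sum(pw * nu[a] for pw, nu in zip(post, envs)) for a in alphabet}

# Example: ternary alphabet {a, b, c}, two candidate environments.
envs = [{'a': 0.7, 'b': 0.2, 'c': 0.1},
        {'a': 0.1, 'b': 0.2, 'c': 0.7}]
weights = [0.5, 0.5]

# After observing mostly 'a', the posterior concentrates on the first
# environment, and xi's prediction approaches that environment's.
pred = mixture_predict(envs, weights, history=['a', 'a', 'b'])
```

This toy class is finite and trivially computable; the paper's results concern how fast such a mixture's predictions converge to the true environment's, measured in expected squared difference and prediction errors.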
Publication: arXiv e-prints
Pub Date: June 2001
arXiv: arXiv:cs/0106036
Bibcode: 2001cs........6036H
Keywords: Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Computer Science - Computational Complexity; Mathematics - Probability; F.2.3
E-Print: 11 LaTeX pages