Complexity Monotone in Conditions and Future Prediction Errors

Authors: Alexey Chernov, Marcus Hutter, Jürgen Schmidhuber



Cite As

Alexey Chernov, Marcus Hutter, and Jürgen Schmidhuber. Complexity Monotone in Conditions and Future Prediction Errors. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


Abstract

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor $M$ from the true distribution $\mu$ by the algorithmic complexity of $\mu$. Here we assume we are at a time $t>1$ and have already observed $x = x_1 \ldots x_t$. We bound the future prediction performance on $x_{t+1} x_{t+2} \ldots$ by a new variant of algorithmic complexity of $\mu$ given $x$, plus the complexity of the randomness deficiency of $x$. The new complexity is monotone in its condition in the sense that this complexity can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
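For context, Solomonoff's total-deviation bound mentioned above can be stated as follows (this is the standard formulation, e.g. in Hutter's notation; it is reproduced here for illustration, not quoted from this paper). For a computable measure $\mu$ on binary sequences and Solomonoff's universal predictor $M$:

```latex
% Classical Solomonoff bound: the cumulative expected squared
% prediction error of M is finitely bounded by the (prefix)
% Kolmogorov complexity K(mu) of the true distribution.
\sum_{t=1}^{\infty}
  \sum_{x_{<t}} \mu(x_{<t})
  \sum_{a \in \{0,1\}}
    \bigl( M(a \mid x_{<t}) - \mu(a \mid x_{<t}) \bigr)^{2}
\;\le\; \frac{\ln 2}{2}\, K(\mu)
```

The paper's contribution is a posterior analogue: after observing $x = x_1 \ldots x_t$, the remaining (future) loss is bounded by a monotone conditional complexity of $\mu$ given $x$, plus a term for the randomness deficiency of $x$.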
Keywords

  • Kolmogorov complexity
  • posterior bounds
  • online sequential prediction
  • Solomonoff prior
  • monotone conditional complexity
  • total error
  • future loss
  • ra


  • Access Statistics
  • Total Accesses (updated on a weekly basis)
    PDF Downloads