Dagstuhl Seminar Proceedings, Volume 6051



Publication Details

  • Published: 2006-07-31
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik

Documents
Document
06051 Abstracts Collection – Kolmogorov Complexity and Applications

Authors: Marcus Hutter, Wolfgang Merkle, and Paul M.B. Vitanyi


Abstract
From 29.01.06 to 03.02.06, the Dagstuhl Seminar 06051 ``Kolmogorov Complexity and Applications'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Marcus Hutter, Wolfgang Merkle, and Paul M.B. Vitanyi. 06051 Abstracts Collection – Kolmogorov Complexity and Applications. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{hutter_et_al:DagSemProc.06051.1,
  author =	{Hutter, Marcus and Merkle, Wolfgang and Vitanyi, Paul M.B.},
  title =	{{06051 Abstracts Collection – Kolmogorov Complexity and Applications}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--17},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.1},
  URN =		{urn:nbn:de:0030-drops-6632},
  doi =		{10.4230/DagSemProc.06051.1},
  annote =	{Keywords: Information theory, Kolmogorov Complexity, effective randomness, algorithmic probability, recursion theory, computational complexity, machine learning, knowledge discovery}
}
Document
Application of Kolmogorov complexity and universal codes to identity testing and nonparametric testing of serial independence for time series.

Authors: Boris Ryabko, Jaakko Astola, and Alex Gammerman


Abstract
We show that Kolmogorov complexity and its estimators, such as universal codes (or data compression methods), can be applied to hypothesis testing within the framework of classical mathematical statistics. Methods for identity testing and for nonparametric testing of serial independence for time series are described.
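
To make the codelength idea concrete, here is a minimal Python sketch of such a codelength-based identity test, using zlib as a stand-in for a universal code. The function name, threshold form, and the uniform-bytes example are illustrative assumptions, not the authors' exact procedure.

import math
import zlib

def identity_test(data: bytes, p0: dict, alpha: float = 0.05) -> bool:
    # Reject H0 ("data is i.i.d. with byte distribution p0") if the universal
    # code beats the ideal H0 codelength by more than log2(1/alpha) bits.
    # The 1e-12 floor only keeps the arithmetic finite for zero-probability bytes.
    l_h0 = -sum(math.log2(p0.get(b, 1e-12)) for b in data)  # Shannon codelength under H0, in bits
    l_univ = 8 * len(zlib.compress(data, 9))                 # compressed size, in bits
    return l_h0 - l_univ > math.log2(1 / alpha)

# Example: highly repetitive text is not plausibly uniform over bytes 0..255.
text = b"abracadabra " * 200
uniform = {b: 1 / 256 for b in range(256)}
print(identity_test(text, uniform))  # True: H0 is rejected

The rationale is the standard no-hypercompression argument: if the data really came from H0, no code can undercut the ideal H0 codelength by more than $\log_2(1/\alpha)$ bits except with probability at most $\alpha$.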

Cite as

Boris Ryabko, Jaakko Astola, and Alex Gammerman. Application of Kolmogorov complexity and universal codes to identity testing and nonparametric testing of serial independence for time series. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{ryabko_et_al:DagSemProc.06051.2,
  author =	{Ryabko, Boris and Astola, Jaakko and Gammerman, Alex},
  title =	{{Application of Kolmogorov complexity and universal codes to identity testing and nonparametric testing of serial independence for time series.}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--13},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.2},
  URN =		{urn:nbn:de:0030-drops-6363},
  doi =		{10.4230/DagSemProc.06051.2},
  annote =	{Keywords: Algorithmic complexity, algorithmic information theory, Kolmogorov complexity, universal coding, hypothesis testing}
}
Document
Automatic Meaning Discovery Using Google

Authors: Rudi Cilibrasi and Paul M.B. Vitanyi


Abstract
We survey a new area of parameter-free similarity distance measures useful in data mining, pattern recognition, learning and automatic semantics extraction. Given a family of distances on a set of objects, a distance is universal up to a certain precision for that family if it minorizes every distance in the family between every two objects in the set, up to the stated precision (we do not require the universal distance to be an element of the family). We consider similarity distances for two types of objects: literal objects that as such contain all of their meaning, like genomes or books, and names for objects. The latter may have literal embodiments like the first type, but may also be abstract like ``red'' or ``christianity.'' For the first type we consider a family of computable distance measures corresponding to parameters expressing similarity according to particular features between pairs of literal objects. For the second type we consider similarity distances generated by web users corresponding to particular semantic relations between (the names for) the designated objects. For both families we give universal similarity distance measures, incorporating all particular distance measures in the family. In the first case the universal distance is based on compression and in the second case it is based on Google page counts related to search terms. In both cases experiments on a massive scale give evidence of the viability of the approaches.
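
As an editorial illustration of the compression-based half of this programme (not the authors' code), the normalized compression distance can be approximated with any real-world compressor; the Google-based variant described above replaces compressed lengths with logarithms of page counts. A minimal Python sketch using zlib:

import zlib

def c(data: bytes) -> int:
    # Compressed length in bytes, standing in for the complexity C(x).
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized Compression Distance:
    # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

s1 = b"the quick brown fox jumps over the lazy dog " * 50
s2 = b"the quick brown fox leaps over the lazy cat " * 50
s3 = bytes(range(256)) * 10
print(ncd(s1, s2))  # noticeably smaller than the value below: shared structure
print(ncd(s1, s3))  # closer to 1: little shared structure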

Cite as

Rudi Cilibrasi and Paul M.B. Vitanyi. Automatic Meaning Discovery Using Google. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{cilibrasi_et_al:DagSemProc.06051.3,
  author =	{Cilibrasi, Rudi and Vitanyi, Paul M.B.},
  title =	{{Automatic Meaning Discovery Using Google}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--23},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.3},
  URN =		{urn:nbn:de:0030-drops-6296},
  doi =		{10.4230/DagSemProc.06051.3},
  annote =	{Keywords: Normalized Compression Distance, Clustering, Classification, Relative Semantics of Terms, Google, World-Wide-Web, Kolmogorov complexity}
}
Document
Binary Lambda Calculus and Combinatory Logic

Authors: John Tromp


Abstract
We introduce binary representations of both lambda calculus and combinatory logic terms, and demonstrate their simplicity by providing very compact parser-interpreters for these binary languages. We demonstrate their application to Algorithmic Information Theory with several concrete upper bounds on program-size complexity, including an elegant self-delimiting code for binary strings.
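
To illustrate the general idea of a self-delimiting (prefix-free) code for binary strings, here is a hedged Python sketch of a simple unary-length scheme; it shows only the concept, not the code from the paper, and it is deliberately wasteful (about 2n+1 bits for an n-bit string):

def encode(bits: str) -> str:
    # Prefix the payload with its length written in unary ("111...10"),
    # so no codeword is a prefix of another codeword.
    assert set(bits) <= {"0", "1"}
    return "1" * len(bits) + "0" + bits

def decode(stream: str):
    # Decode one codeword from the front; return (payload, remaining stream).
    n = stream.index("0")
    return stream[n + 1 : n + 1 + n], stream[n + 1 + n :]

# Concatenated codewords still decode unambiguously.
s = encode("1011") + encode("0")
first, rest = decode(s)
second, _ = decode(rest)
print(first, second)  # 1011 0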

Cite as

John Tromp. Binary Lambda Calculus and Combinatory Logic. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{tromp:DagSemProc.06051.4,
  author =	{Tromp, John},
  title =	{{Binary Lambda Calculus and Combinatory Logic}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--20},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.4},
  URN =		{urn:nbn:de:0030-drops-6289},
  doi =		{10.4230/DagSemProc.06051.4},
  annote =	{Keywords: Concrete, program size complexity, lambda calculus, combinatory logic, encoding, self-delimiting, binary strings}
}
Document
Combinatorial proof of Muchnik's theorem

Authors: Alexander Shen


Abstract
The original proof of Muchnik's theorem on conditional descriptions can be modified and split into two parts: (1) we construct a graph that allows large online matchings (the main part); (2) we use this graph to prove the theorem. The question about online matchings could be interesting in itself.

Cite as

Alexander Shen. Combinatorial proof of Muchnik's theorem. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{shen:DagSemProc.06051.5,
  author =	{Shen, Alexander},
  title =	{{Combinatorial proof of Muchnik's theorem}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--5},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.5},
  URN =		{urn:nbn:de:0030-drops-6258},
  doi =		{10.4230/DagSemProc.06051.5},
  annote =	{Keywords: Matching, conditional descriptions, Kolmogorov complexity}
}
Document
Complexity Monotone in Conditions and Future Prediction Errors

Authors: Alexey Chernov, Marcus Hutter, and Jürgen Schmidhuber


Abstract
We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor $M$ from the true distribution $\mu$ by the algorithmic complexity of $\mu$. Here we assume we are at a time $t>1$ and have already observed $x = x_1 \ldots x_t$. We bound the future prediction performance on $x_{t+1} x_{t+2} \ldots$ by a new variant of the algorithmic complexity of $\mu$ given $x$, plus the complexity of the randomness deficiency of $x$. The new complexity is monotone in its condition in the sense that this complexity can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
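
For orientation, the classical Solomonoff bound referred to above can be stated (in commonly used notation, which need not match the paper's) as follows, where $M$ is Solomonoff's universal predictor, $\mu$ is the true computable distribution, and $K(\mu)$ is its prefix complexity:

\[
\sum_{t=1}^{\infty} \mathbf{E}_{\mu}\!\left[\bigl(M(x_t{=}1 \mid x_{<t}) - \mu(x_t{=}1 \mid x_{<t})\bigr)^{2}\right] \;\le\; \frac{\ln 2}{2}\, K(\mu),
\]

so the total expected squared prediction error over the whole sequence is finite whenever $\mu$ is computable; the abstract's contribution is a posterior analogue of such bounds for the remaining sequence after $x$ has been observed.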

Cite as

Alexey Chernov, Marcus Hutter, and Jürgen Schmidhuber. Complexity Monotone in Conditions and Future Prediction Errors. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{chernov_et_al:DagSemProc.06051.6,
  author =	{Chernov, Alexey and Hutter, Marcus and Schmidhuber, J\"{u}rgen},
  title =	{{Complexity Monotone in Conditions and Future Prediction Errors}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--20},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.6},
  URN =		{urn:nbn:de:0030-drops-6327},
  doi =		{10.4230/DagSemProc.06051.6},
  annote =	{Keywords: Kolmogorov complexity, posterior bounds, online sequential prediction, Solomonoff prior, monotone conditional complexity, total error, future loss, ra}
}
Document
Error in Enumerable Sequence Prediction

Authors: Nick Hay


Abstract
We outline a method for quantifying the error of a sequence prediction. With sequence predictions represented by semimeasures $\nu(x)$, we define their error to be $-\log_2 \nu(x)$. We note that enumerable semimeasures are those which model the sequence as the output of a computable system given unknown input. Using this we define the simulation complexity of a computable system $C$ relative to another $U$, giving an exact bound on their difference in error. This error in turn gives an exact upper bound on the number of predictions $\nu$ gets incorrect.
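
As a small arithmetic illustration of this error measure (an editorial sketch, not code from the paper): a sequential predictor that assigns probability $p_t$ to each observed symbol has error $-\log_2 \prod_t p_t = -\sum_t \log_2 p_t$ bits.

import math

def prediction_error(per_symbol_probs):
    # -log2 of the probability assigned to the whole observed sequence, in bits.
    return -sum(math.log2(p) for p in per_symbol_probs)

print(prediction_error([0.5] * 10))  # 10.0 bits: "50/50" on each of 10 bits
print(prediction_error([0.9] * 10))  # ~1.52 bits: confident, correct predictions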

Cite as

Nick Hay. Error in Enumerable Sequence Prediction. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{hay:DagSemProc.06051.7,
  author =	{Hay, Nick},
  title =	{{Error in Enumerable Sequence Prediction}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--5},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.7},
  URN =		{urn:nbn:de:0030-drops-6331},
  doi =		{10.4230/DagSemProc.06051.7},
  annote =	{Keywords: Sequence prediction, Solomonoff induction, enumerable semimeasures}
}
Document
Learning in Reactive Environments with Arbitrary Dependence

Authors: Daniil Ryabko and Marcus Hutter


Abstract
In reinforcement learning the task for an agent is to attain the best possible asymptotic reward where the true generating environment is unknown but belongs to a known countable family of environments. This task generalises the sequence prediction problem, in which the environment does not react to the behaviour of the agent. Solomonoff induction solves the sequence prediction problem for any countable class of measures; however, it is easy to see that such a result is impossible for reinforcement learning: not every countable class of environments can be learnt. We find some sufficient conditions on the class of environments under which an agent exists which attains the best asymptotic reward for any environment in the class. We analyze how tight these conditions are and how they relate to different probabilistic assumptions known in reinforcement learning and related fields, such as Markov Decision Processes and mixing conditions.

Cite as

Daniil Ryabko and Marcus Hutter. Learning in Reactive Environments with Arbitrary Dependence. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{ryabko_et_al:DagSemProc.06051.8,
  author =	{Ryabko, Daniil and Hutter, Marcus},
  title =	{{Learning in Reactive Environments with Arbitrary Dependence}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--15},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.8},
  URN =		{urn:nbn:de:0030-drops-6372},
  doi =		{10.4230/DagSemProc.06051.8},
  annote =	{Keywords: Reinforcement learning, asymptotic average value, self-optimizing policies, (non) Markov decision processes}
}
Document
Multisource Algorithmic Information Theory

Authors: Alexander Shen


Abstract
Multisource information theory is well known in the Shannon setting. It studies the possibilities of information transfer through a network with limited capacities. Similar questions can be studied for algorithmic information theory and provide a framework for several known results and interesting questions.

Cite as

Alexander Shen. Multisource Algorithmic Information Theory. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{shen:DagSemProc.06051.9,
  author =	{Shen, Alexander},
  title =	{{Multisource Algorithmic Information Theory}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--12},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.9},
  URN =		{urn:nbn:de:0030-drops-6267},
  doi =		{10.4230/DagSemProc.06051.9},
  annote =	{Keywords: Kolmogorov complexity, multisource information theory}
}
Document
Natural Halting Probabilities, Partial Randomness, and Zeta Functions

Authors: Christian S. Calude and Michael A. Stay


Abstract
We introduce the natural halting probability and the natural complexity of a Turing machine and we relate them to program-size complexity and Chaitin's halting probability. A classification of Turing machines according to their natural ($\Omega$) halting probabilities is proposed: divergent, convergent and tuatara. We prove the existence of universal convergent and tuatara machines. Various results on randomness and partial randomness are proved. For example, we show that the natural halting probability of a universal tuatara machine is c.e. and random. A new type of partial randomness, asymptotic randomness, is introduced. Finally we show that in contrast to classical (algorithmic) randomness---which cannot be characterised in terms of plain complexity---various types of partial randomness admit such characterisations.
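
For context (an editorial addition, not part of the abstract): Chaitin's halting probability, to which these natural variants are compared, is classically defined for a universal prefix-free machine $U$ as

\[
\Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|},
\]

a computably enumerable real that is algorithmically random.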

Cite as

Christian S. Calude and Michael A. Stay. Natural Halting Probabilities, Partial Randomness, and Zeta Functions. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, p. 1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{calude_et_al:DagSemProc.06051.10,
  author =	{Calude, Christian S. and Stay, Michael A.},
  title =	{{Natural Halting Probabilities, Partial Randomness, and Zeta Functions}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--1},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.10},
  URN =		{urn:nbn:de:0030-drops-6319},
  doi =		{10.4230/DagSemProc.06051.10},
  annote =	{Keywords: Natural halting probability, natural complexity}
}
Document
On impossibility of sequential algorithmic forecasting

Authors: Vladimir V'Yugin


Abstract
The problem of predicting a future event given an individual sequence of past events is considered. Predictions are given in the form of real numbers $p_n$ which are computed by some algorithm $\varphi$ using initial fragments $\omega_1, \dots, \omega_{n-1}$ of an individual binary sequence $\omega = \omega_1, \omega_2, \dots$ and can be interpreted as probabilities of the event $\omega_n = 1$ given this fragment. Following Dawid's prequential framework, we consider partial forecasting algorithms $\varphi$ which are defined on all initial fragments of $\omega$ and can be undefined outside the given sequence of outcomes. We show that even for this large class of forecasting algorithms it is possible, by combining coin-tossing with a transducer algorithm, to efficiently generate, with probability close to one, sequences on which any partial forecasting algorithm fails the verification method called calibration.

Cite as

Vladimir V'Yugin. On impossibility of sequential algorithmic forecasting. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{vyugin:DagSemProc.06051.11,
  author =	{V'Yugin, Vladimir},
  title =	{{On impossibility of sequential algorithmic forecasting}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--7},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.11},
  URN =		{urn:nbn:de:0030-drops-6305},
  doi =		{10.4230/DagSemProc.06051.11},
  annote =	{Keywords: Universal forecasting, computable calibration, Dawid's prequential framework, algorithmic randomness, defensive forecasting}
}
Document
Recent Results in Universal and Non-Universal Induction

Authors: Jan Poland


Abstract
We present and relate recent results in prediction based on countable classes of either probability (semi-)distributions or base predictors. Learning by Bayes, MDL, and stochastic model selection will be considered as instances of the first category. In particular, we will show how assertions analogous to Solomonoff's universal induction result can be obtained for MDL and stochastic model selection. The second category is based on prediction with expert advice. We will present a recent construction to define a universal learner in this framework.

Cite as

Jan Poland. Recent Results in Universal and Non-Universal Induction. In Kolmogorov Complexity and Applications. Dagstuhl Seminar Proceedings, Volume 6051, pp. 1-11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{poland:DagSemProc.06051.12,
  author =	{Poland, Jan},
  title =	{{Recent Results in Universal and Non-Universal Induction}},
  booktitle =	{Kolmogorov Complexity and Applications},
  pages =	{1--11},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6051},
  editor =	{Marcus Hutter and Wolfgang Merkle and Paul M.B. Vitanyi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.06051.12},
  URN =		{urn:nbn:de:0030-drops-6355},
  doi =		{10.4230/DagSemProc.06051.12},
  annote =	{Keywords: Bayesian learning, MDL, stochastic model selection, prediction with expert advice, universal learning, Solomonoff induction}
}
