Information Distance Revisited

Author Bruno Bauwens




File

LIPIcs.STACS.2020.46.pdf
  • Filesize: 436 kB
  • 14 pages

Author Details

Bruno Bauwens
  • National Research University Higher School of Economics, Moscow, Russia

Acknowledgements

This work was initiated by Alexander (Sasha) Shen, who informed me about the error in [Mahmud, 2009] during a discussion of the paper [Vitányi, 2017]. Afterwards I explained the proof of Theorem 13 to Sasha. He simplified it, and he wrote all of the current manuscript, including the proof of Theorem 13, which is available in the arXiv version of this paper [Bauwens and Shen, 2018]. Later, I added Theorem 14. Only this proof was written by me; it is also available on arXiv [Bauwens and Shen, 2018]. After this was added, Sasha decided that his contribution was no longer proportional and that he no longer wanted to remain an author. I am especially grateful for his generous permission to publish this nicely written document, with minor modifications suggested by reviewers. I thank the reviewers for these suggestions. All errors in this document are solely my responsibility. I thank Mikhail Andreev for the proof of Proposition 7 and many useful discussions. Finally, I thank Artem Grachev and the participants of the Kolmogorov seminar at Moscow State University for useful discussions.

Cite As

Bruno Bauwens. Information Distance Revisited. In 37th International Symposium on Theoretical Aspects of Computer Science (STACS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 154, pp. 46:1-46:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)
https://doi.org/10.4230/LIPIcs.STACS.2020.46

Abstract

We consider the notion of information distance between two objects x and y introduced by Bennett, Gács, Li, Vitányi, and Zurek [C. H. Bennett et al., 1998] as the minimal length of a program that computes x from y, as well as y from x, and study different versions of this notion. In the above paper, it was shown that the prefix version of information distance equals max(K(x|y), K(y|x)) up to additive logarithmic terms. It was claimed by Mahmud [Mahmud, 2009] that this equality holds up to additive O(1)-precision. We show that this claim is false, but that the equality does hold whenever the distance is at least logarithmic. This implies that the original definition provides a metric on strings that are superlogarithmically separated.
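The quantity max(K(x|y), K(y|x)) is not computable, but it motivates a well-known practical heuristic, the normalized compression distance of Li and Vitányi [Li and Vitányi, 2019], in which Kolmogorov complexity is replaced by the length achieved by a real compressor. The following is a minimal sketch of that heuristic in Python; the function names are my own, and zlib is only a crude stand-in for K, so the numbers it produces are rough approximations at best.

```python
import zlib

def C(data: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity: compressed length."""
    return len(zlib.compress(data, 9))

def information_distance(x: bytes, y: bytes) -> int:
    """Compressor-based stand-in for max(K(x|y), K(y|x)),
    using the approximation C(xy) - min(C(x), C(y))."""
    return C(x + y) - min(C(x), C(y))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: roughly 0 for very similar
    inputs, roughly 1 for unrelated incompressible inputs."""
    return information_distance(x, y) / max(C(x), C(y))
```

For example, ncd(x, x) is close to 0 for any x that fits in the compressor's window, while ncd between x and an unrelated random string of the same length is close to 1.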

Subject Classification

ACM Subject Classification
  • Theory of computation
Keywords
  • Kolmogorov complexity
  • algorithmic information distance


References

  1. Bruno Bauwens and Alexander Shen. Information distance revisited. arXiv preprint, 2018. URL: http://arxiv.org/abs/1807.11087.
  2. C. H. Bennett, P. Gács, M. Li, P. M. B. Vitányi, and W. H. Zurek. Information distance. IEEE Transactions on Information Theory, 44(4), 1998.
  3. G. J. Chaitin. A theory of program size formally identical to information theory. J. Assoc. Comput. Mach., 22(3):329-340, 1975. URL: https://doi.org/10.1145/321892.321894.
  4. P. Gács. On the symmetry of algorithmic information. Soviet Math. Dokl., 15(5):1477-1480, 1974.
  5. P. Gács. On the relation between descriptional complexity and algorithmic probability. Theor. Comput. Sci., 22:71-93, 1983.
  6. Leonid A. Levin. Some theorems on the algorithmic approach to probability theory and information theory. PhD thesis, Boston University, MA, United States, 1971. Dissertation directed by A. N. Kolmogorov; turned down as required by the Soviet authorities despite unanimously positive reviews. Translated into English in [10].
  7. Leonid A. Levin. On the notion of a random sequence. Soviet Mathematics Doklady, 14:1413-1416, 1973.
  8. Leonid A. Levin. Laws of information conservation (nongrowth) and aspects of the foundation of probability theory. Problemy Peredachi Informatsii, 10(3):30-35, 1974.
  9. Leonid A. Levin. Various measures of complexity for finite objects (axiomatic description). Soviet Mathematics Doklady, 17(2):522-526, 1976.
  10. Leonid A. Levin. Some theorems on the algorithmic approach to probability theory and information theory. Annals of Pure and Applied Logic, 162:224-235, 2010. 1971 dissertation directed by A. N. Kolmogorov; turned down as required by the Soviet authorities despite unanimously positive reviews.
  11. Ming Li and Paul M. B. Vitányi. An Introduction to Kolmogorov Complexity and Its Applications, 4th edition. Springer, 2019. 1st ed. 1993; 2nd ed. 1997; 3rd ed. 2008.
  12. Chong Long, Xiaoyan Zhu, Ming Li, and Bin Ma. Information shared by many objects. In Proceedings of the 17th ACM Conference on Information and Knowledge Management, pages 1213-1220. ACM, 2008.
  13. M. M. Hassan Mahmud. On universal transfer learning. Theoretical Computer Science, 410(19):1826-1846, April 2009.
  14. Alexander Shen, Vladimir A. Uspensky, and Nikolay Vereshchagin. Kolmogorov Complexity and Algorithmic Randomness. Mathematical Surveys and Monographs, volume 220, xviii+511 pages. American Mathematical Society, 2017. Draft version: http://www.lirmm.fr/~ashen/kolmbook-eng.pdf.
  15. Paul M. B. Vitányi. Exact expression for information distance. IEEE Transactions on Information Theory, 63:4725-4728, 2017. URL: http://arxiv.org/abs/1410.7328.