Random Noise Increases Kolmogorov Complexity and Hausdorff Dimension

Authors: Gleb Posobin, Alexander Shen



Author Details

Gleb Posobin
  • Computer Science department, Columbia University, New York, USA
Alexander Shen
  • LIRMM CNRS & University of Montpellier, 161 rue Ada, 34095 Montpellier, France
  • On leave from IITP RAS, Moscow

Acknowledgements

The authors are grateful to the participants and organizers of the Heidelberg Kolmogorov complexity program, where the question of the complexity increase was raised, and to all colleagues (from the ESCAPE team, LIRMM, Montpellier, the Kolmogorov seminar and HSE Theoretical Computer Science Group, and other places) who participated in the discussions, in particular to B. Bauwens, N. Greenberg, K. Makarychev, Yu. Makarychev, J. Miller, A. Milovanov, F. Nazarov, I. Razenshteyn, A. Romashchenko, N. Vereshchagin, L. B. Westrick, and, last but not least, to Peter Gács, who explained to us how the tools from [Ahlswede et al., 1976] can be used to obtain the desired result about Kolmogorov complexity.

Cite As

Gleb Posobin and Alexander Shen. Random Noise Increases Kolmogorov Complexity and Hausdorff Dimension. In 36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 126, pp. 57:1-57:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)
https://doi.org/10.4230/LIPIcs.STACS.2019.57

Abstract

Consider a bit string x of length n with Kolmogorov complexity αn (for some α < 1). It is always possible to increase the complexity of x by changing a small fraction of its bits [Harry Buhrman et al., 2005]. What happens to the complexity of x when we flip each bit independently with some probability τ? We prove that a linear increase in complexity happens with high probability, but this increase is smaller than in the case of arbitrary changes considered in [Harry Buhrman et al., 2005]. The amount of the increase depends on x (strings of the same complexity can behave differently). We give exact lower and upper bounds for this increase (with o(n) precision). The same technique is used to prove results about the effective Hausdorff dimension of infinite sequences. We show that random change increases the dimension with probability 1, and we provide an optimal lower bound for the dimension of the changed sequence. We also improve a result from [Noam Greenberg et al., 2018] and show that for every sequence ω of dimension α there exists a strongly α-random sequence ω' such that the Besicovitch distance between ω and ω' is 0. The proofs use combinatorial and probabilistic reformulations of complexity statements and a technique that goes back to Ahlswede, Gács, and Körner [Ahlswede et al., 1976].
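The noise model in the abstract — each bit of x flipped independently with probability τ (a binary symmetric channel) — can be sketched as follows. This is only an illustrative simulation: zlib compression serves here as a crude, computable stand-in for Kolmogorov complexity (which is uncomputable), and all names (`flip_bits`, `compressed_len`) are our own, not from the paper.

```python
import random
import zlib

def flip_bits(bits, tau, rng):
    """Binary symmetric channel: flip each bit independently with probability tau."""
    return [b ^ (rng.random() < tau) for b in bits]

def compressed_len(bits):
    """Crude complexity proxy: length of the zlib-compressed bit sequence.
    (True Kolmogorov complexity is uncomputable, so any such proxy is heuristic.)"""
    return len(zlib.compress(bytes(bits)))

rng = random.Random(0)           # fixed seed for reproducibility
n, tau = 10_000, 0.1
x = [0] * n                      # a maximally compressible string: complexity ~ 0
y = flip_bits(x, tau, rng)       # the noisy version of x

print(compressed_len(x), compressed_len(y))
```

For this highly compressible x, the compressed length of the noisy string is much larger than that of the original, in line with the abstract's claim that noise increases complexity with high probability; the paper's results quantify the size of this increase exactly, up to o(n).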

Subject Classification

ACM Subject Classification
  • Theory of computation → Randomness, geometry and discrete structures
Keywords
  • Kolmogorov complexity
  • effective Hausdorff dimension
  • random noise

References

  1. Rudolf Ahlswede, Peter Gács, and János Körner. Bounds on conditional probabilities with applications in multi-user communication. Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete, 34(2):157-177, 1976.
  2. Harry Buhrman, Lance Fortnow, Ilan Newman, and Nikolai K. Vereshchagin. Increasing Kolmogorov Complexity. In Volker Diekert and Bruno Durand, editors, STACS 2005, 22nd Annual Symposium on Theoretical Aspects of Computer Science, Stuttgart, Germany, February 24-26, 2005, Proceedings, volume 3404 of Lecture Notes in Computer Science, pages 412-421. Springer, 2005. URL: http://dx.doi.org/10.1007/978-3-540-31856-9_34.
  3. Rodney G. Downey and Denis R. Hirschfeldt. Algorithmic Randomness and Complexity. Theory and Applications of Computability. Springer, 2010. URL: http://dx.doi.org/10.1007/978-0-387-68441-3.
  4. Peter Frankl and Zoltán Füredi. A short proof for a theorem of Harper about Hamming-spheres. Discrete Mathematics, 34(3):311-313, 1981. URL: http://dx.doi.org/10.1016/0012-365X(81)90009-1.
  5. Noam Greenberg, Joseph S. Miller, Alexander Shen, and Linda Brown Westrick. Dimension 1 sequences are close to randoms. Theoretical Computer Science, 705:99-112, 2018. URL: http://dx.doi.org/10.1016/j.tcs.2017.09.031.
  6. Wassily Hoeffding. Probability inequalities for sums of bounded random variables. Journal of the American Statistical Association, 58(301):13-30, 1963.
  7. Andrei N. Kolmogorov. Three approaches to the quantitative definition of information. Problems of Information Transmission, 1(1):3-11, 1965.
  8. Ming Li and Paul M. B. Vitányi. An Introduction to Kolmogorov Complexity and Its Applications, Third Edition. Texts in Computer Science. Springer, 2008. URL: http://dx.doi.org/10.1007/978-0-387-49820-1.
  9. Grigory A. Margulis. Probabilistic properties of highly connected graphs. Problems of Information Transmission, 10(2):174-179, 1974.
  10. Katalin Marton. A simple proof of the blowing-up lemma. IEEE Transactions on Information Theory, 32(3):445-446, 1986. URL: http://dx.doi.org/10.1109/TIT.1986.1057176.
  11. Colin McDiarmid. On the method of bounded differences, pages 148-188. London Mathematical Society Lecture Note Series. Cambridge University Press, 1989. URL: http://dx.doi.org/10.1017/CBO9781107359949.008.
  12. Alexander Shen, Vladimir A. Uspensky, and Nikolay Vereshchagin. Kolmogorov complexity and algorithmic randomness, volume 220. American Mathematical Society, 2017.
  13. Nikolai K. Vereshchagin and Alexander Shen. Algorithmic statistics: forty years later. In Computability and Complexity. Essays Dedicated to Rodney G. Downey on the Occasion of His 60th Birthday, volume 10010 of Lecture Notes in Computer Science, pages 669-737. Springer, July 2017. URL: http://arxiv.org/abs/1607.08077.
  14. Nikolai K. Vereshchagin and Paul M. B. Vitányi. Rate Distortion and Denoising of Individual Data Using Kolmogorov Complexity. IEEE Transactions on Information Theory, 56(7):3438-3454, 2010.
  15. Aaron D. Wyner and Jacob Ziv. A Theorem on the Entropy of Certain Binary Sequences and Applications: Part I. IEEE Transactions on Information Theory, 19(6):769-772, 1973. URL: http://dx.doi.org/10.1109/TIT.1973.1055107.