An Efficient Coding Theorem via Probabilistic Representations and Its Applications

Authors

Zhenjian Lu, Igor C. Oliveira




File

LIPIcs.ICALP.2021.94.pdf
  • Filesize: 0.79 MB
  • 20 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.ICALP.2021.94

Author Details

Zhenjian Lu
  • University of Warwick, Coventry, UK
Igor C. Oliveira
  • University of Warwick, Coventry, UK

Acknowledgements

We are grateful to Lijie Chen for a suggestion that allowed us to improve the list size in Corollary 3. We also thank Rahul Santhanam and Rahul Ilango for discussions on search-to-decision reductions, and James Maynard for an email exchange on number-theoretic techniques and the problem of generating primes. Igor would like to thank Michal Koucký for asking a question about the probability threshold in the definition of rKt during the Dagstuhl Seminar "Computational Complexity of Discrete Problems" (2019). This work received support from the Royal Society University Research Fellowship URF\R1\191059.

Cite As

Zhenjian Lu and Igor C. Oliveira. An Efficient Coding Theorem via Probabilistic Representations and Its Applications. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 94:1-94:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)
https://doi.org/10.4230/LIPIcs.ICALP.2021.94

Abstract

A probabilistic representation of a string x ∈ {0,1}ⁿ is given by the code of a randomized algorithm that outputs x with high probability [Igor C. Oliveira, 2019]. We employ probabilistic representations to establish the first unconditional Coding Theorem in time-bounded Kolmogorov complexity. More precisely, we show that if a distribution ensemble 𝒟_m can be uniformly sampled in time T(m) and generates a string x ∈ {0,1}^* with probability at least δ, then x admits a time-bounded probabilistic representation of complexity O(log(1/δ) + log(T) + log(m)). Under mild assumptions, a representation of this form can be computed from x and the code of the sampler in time polynomial in n = |x|. We derive consequences of this result relevant to the study of data compression, pseudodeterministic algorithms, time hierarchies for sampling distributions, and complexity lower bounds. In particular, we describe an instance-based search-to-decision reduction for Levin’s Kt complexity [Leonid A. Levin, 1984] and its probabilistic analogue rKt [Igor C. Oliveira, 2019]. As a consequence, if a string x admits a succinct time-bounded representation, then a near-optimal representation can be generated from x with high probability in polynomial time. This partially addresses, in a time-bounded setting, a question from [Leonid A. Levin, 1984] on the efficiency of computing an optimal encoding of a string.
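For orientation, the LaTeX sketch below restates the bound from the abstract together with the definitions of Levin's Kt and its probabilistic analogue rKt as they are usually stated in [Leonid A. Levin, 1984] and [Igor C. Oliveira, 2019]; the 2/3 success threshold and the exact shape of the bound reflect our reading of the abstract and the standard conventions, not a verbatim quote of the paper's theorem.

% Definitions recalled from the cited works; U denotes a fixed universal machine,
% and the 2/3 threshold for rKt is the usual convention (an assumption here).
\[
  \mathrm{Kt}(x) \;=\; \min_{p,\,t}\bigl\{\, |p| + \lceil \log t \rceil \;:\; U(p) \text{ outputs } x \text{ within } t \text{ steps} \,\bigr\},
\]
\[
  \mathrm{rKt}(x) \;=\; \min_{p,\,t}\bigl\{\, |p| + \lceil \log t \rceil \;:\; \Pr\bigl[\,U(p) \text{ outputs } x \text{ within } t \text{ steps}\,\bigr] \ge 2/3 \,\bigr\}.
\]
% Coding Theorem as stated in the abstract: if the ensemble \mathcal{D}_m is samplable
% in time T(m) and \Pr_{y \sim \mathcal{D}_m}[\, y = x \,] \ge \delta, then
\[
  \mathrm{rKt}(x) \;=\; O\bigl(\log(1/\delta) + \log T(m) + \log m\bigr).
\]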

Subject Classification

ACM Subject Classification
  • Theory of computation
Keywords
  • computational complexity
  • randomized algorithms
  • Kolmogorov complexity


References

  1. Eric Allender. Applications of time-bounded Kolmogorov complexity in complexity theory. In Kolmogorov complexity and computational complexity, pages 4-22. Springer, 1992.
  2. Eric Allender. When worlds collide: Derandomization, lower bounds, and Kolmogorov complexity. In International Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS), pages 1-15. Springer, 2001.
  3. Eric Allender. The complexity of complexity. In Computability and Complexity, pages 79-94. Springer, 2017.
  4. Eric Allender, Harry Buhrman, Michal Koucký, Dieter van Melkebeek, and Detlef Ronneburger. Power from random strings. SIAM J. Comput., 35(6):1467-1493, 2006.
  5. Luis Filipe Coelho Antunes and Lance Fortnow. Worst-case running times for average-case algorithms. In Conference on Computational Complexity (CCC), pages 298-303, 2009.
  6. Bruno Bauwens, Anton Makhlin, Nikolai K. Vereshchagin, and Marius Zimand. Short lists with short programs in short time. Comput. Complex., 27(1):31-61, 2018.
  7. Bruno Bauwens and Marius Zimand. Linear list-approximation for short programs (or the power of a few random bits). In Conference on Computational Complexity (CCC), pages 241-247, 2014.
  8. Harry Buhrman, Troy Lee, and Dieter van Melkebeek. Language compression and pseudorandom generators. Comput. Complex., 14(3):228-255, 2005.
  9. Marco L. Carmosino, Russell Impagliazzo, Valentine Kabanets, and Antonina Kolokolova. Learning algorithms from natural proofs. In Conference on Computational Complexity (CCC), pages 10:1-10:24, 2016.
  10. Lijie Chen, Ce Jin, and R. Ryan Williams. Hardness magnification for all sparse NP languages. In Symposium on Foundations of Computer Science (FOCS), pages 1240-1255, 2019.
  11. Lijie Chen, Ce Jin, and R. Ryan Williams. Sharp threshold results for computational complexity. In Symposium on Theory of Computing (STOC), 2020.
  12. Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. Wiley, 2006.
  13. Lance Fortnow. Kolmogorov complexity and computational complexity. Complexity of Computations and Proofs. Quaderni di Matematica, 13, 2004.
  14. Ofer Gabber and Zvi Galil. Explicit constructions of linear-sized superconcentrators. J. Comput. Syst. Sci., 22(3):407-420, 1981.
  15. David Gillman. A Chernoff bound for random walks on expander graphs. SIAM J. Comput., 27(4):1203-1220, 1998.
  16. Andrew V. Goldberg and Michael Sipser. Compression and ranking. SIAM J. Comput., 20(3):524-536, 1991.
  17. Shuichi Hirahara. Non-black-box worst-case to average-case reductions within NP. In Symposium on Foundations of Computer Science (FOCS), pages 247-258, 2018.
  18. Shuichi Hirahara. Non-disjoint promise problems from meta-computational view of pseudorandom generator constructions. In Conference on Computational Complexity (CCC), 2020.
  19. Rahul Ilango. Connecting Perebor conjectures: Towards a search to decision reduction for minimizing formulas. In Computational Complexity Conference (CCC), 2020.
  20. Rahul Ilango, Bruno Loff, and Igor C. Oliveira. NP-hardness of circuit minimization for multi-output functions. In Computational Complexity Conference (CCC), 2020.
  21. Troy Lee. Kolmogorov complexity and formula lower bounds. PhD thesis, University of Amsterdam, 2006.
  22. Leonid A. Levin. Laws of information conservation (nongrowth) and aspects of the foundation of probability theory. Problemy Peredachi Informatsii, 10(3):30-35, 1974.
  23. Leonid A. Levin. Randomness conservation inequalities; information and independence in mathematical theories. Information and Control, 61(1):15-37, 1984.
  24. Ming Li and Paul M. B. Vitányi. An Introduction to Kolmogorov Complexity and Its Applications. Texts in Computer Science. Springer, 2008.
  25. Yanyi Liu and Rafael Pass. On one-way functions and Kolmogorov complexity. IACR Cryptol. ePrint Arch., 2020:423, 2020.
  26. Luc Longpré and Sarah Mocas. Symmetry of information and one-way functions. Inf. Process. Lett., 46(2):95-100, 1993.
  27. Luc Longpré and Osamu Watanabe. On symmetry of information and polynomial time invertibility. Inf. Comput., 121(1):14-22, 1995.
  28. Zhenjian Lu and Igor Carboni Oliveira. An efficient coding theorem via probabilistic representations and its applications. Electron. Colloquium Comput. Complex., 28:41, 2021.
  29. James Maynard. The twin prime conjecture. Japanese Journal of Mathematics, 14:175-206, 2019.
  30. Igor C. Oliveira. Randomness and intractability in Kolmogorov complexity. In International Colloquium on Automata, Languages, and Programming (ICALP), pages 32:1-32:14, 2019.
  31. Igor C. Oliveira, Ján Pich, and Rahul Santhanam. Hardness magnification near state-of-the-art lower bounds. In Computational Complexity Conference (CCC), pages 27:1-27:29, 2019.
  32. Igor C. Oliveira and Rahul Santhanam. Pseudodeterministic constructions in subexponential time. In Symposium on Theory of Computing (STOC), pages 665-677, 2017.
  33. Alexander A. Razborov and Steven Rudich. Natural proofs. J. Comput. Syst. Sci., 55(1):24-35, 1997.
  34. Alexander Shen, Vladimir A. Uspensky, and Nikolay Vereshchagin. Kolmogorov complexity and algorithmic randomness. American Mathematical Society, 2017.
  35. Michael Sipser. A complexity theoretic approach to randomness. In Symposium on Theory of Computing (STOC), pages 330-335, 1983.
  36. Daniel A. Spielman. Linear-time encodable and decodable error-correcting codes. IEEE Trans. Inf. Theory, 42(6):1723-1731, 1996.
  37. Terence Tao, Ernest Croot, III, and Harald Helfgott. Deterministic methods to find primes. Math. Comp., 81(278):1233-1246, 2012.
  38. Gérald Tenenbaum. Introduction to analytic and probabilistic number theory, volume 163. American Mathematical Society, 2015.
  39. Luca Trevisan, Salil P. Vadhan, and David Zuckerman. Compression of samplable sources. Comput. Complex., 14(3):186-227, 2005.
  40. Gregory Valiant. Lecture Notes for CS265/CME309: Randomized Algorithms and Probabilistic Analysis (Lecture 6), 2019.
  41. Thomas Watson. Time hierarchies for sampling distributions. SIAM J. Comput., 43(5):1709-1727, 2014.