Optimal Coding Theorems in Time-Bounded Kolmogorov Complexity

Authors: Zhenjian Lu, Igor C. Oliveira, Marius Zimand



Author Details

Zhenjian Lu
  • University of Warwick, Coventry, UK
Igor C. Oliveira
  • University of Warwick, Coventry, UK
Marius Zimand
  • Towson University, MD, USA

Acknowledgements

We thank Bruno Bauwens for discussions and useful insights.

Cite As

Zhenjian Lu, Igor C. Oliveira, and Marius Zimand. Optimal Coding Theorems in Time-Bounded Kolmogorov Complexity. In 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 229, pp. 92:1-92:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022) https://doi.org/10.4230/LIPIcs.ICALP.2022.92

Abstract

The classical coding theorem in Kolmogorov complexity states that if an n-bit string x is sampled with probability δ by an algorithm with prefix-free domain, then 𝖪(x) ≤ log(1/δ) + O(1). In a recent work, Lu and Oliveira [Zhenjian Lu and Igor C. Oliveira, 2021] established an unconditional time-bounded version of this result by showing that if x can be efficiently sampled with probability δ, then rKt(x) = O(log(1/δ)) + O(log n), where rKt denotes the randomized analogue of Levin’s Kt complexity. Unfortunately, this result is often insufficient when transferring applications of the classical coding theorem to the time-bounded setting, as it achieves an O(log(1/δ)) bound instead of the information-theoretically optimal log(1/δ).
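
For ease of comparison, the two bounds discussed above can be written side by side in LaTeX notation; both statements are taken directly from this paragraph, with the big-O equality rewritten as an upper bound:

\[
\mathsf{K}(x) \;\le\; \log(1/\delta) + O(1) \qquad \text{(classical coding theorem, no time bound)},
\]
\[
\mathrm{rKt}(x) \;\le\; O(\log(1/\delta)) + O(\log n) \qquad \text{(Lu--Oliveira, 2021, efficient sampler)}.
\]

The multiplicative constant hidden in the O(log(1/δ)) term of the second bound is exactly the loss relative to the information-theoretically optimal log(1/δ).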
Motivated by this discrepancy, we investigate optimal coding theorems in the time-bounded setting. Our main contributions can be summarised as follows; the quantitative bounds from the first two items are restated in display form after the list.
• Efficient coding theorem for rKt with a factor of 2. Addressing a question from [Zhenjian Lu and Igor C. Oliveira, 2021], we show that if x can be efficiently sampled with probability at least δ then rKt(x) ≤ (2 + o(1)) ⋅ log(1/δ) + O(log n). As in previous work, our coding theorem is efficient in the sense that it provides a polynomial-time probabilistic algorithm that, when given x, the code of the sampler, and δ, outputs, with probability ≥ 0.99, a probabilistic representation of x that certifies this rKt complexity bound.
• Optimality under a cryptographic assumption. Under a hypothesis about the security of cryptographic pseudorandom generators, we show that no efficient coding theorem can achieve a bound of the form rKt(x) ≤ (2 - o(1)) ⋅ log(1/δ) + poly(log n). Under a weaker assumption, we exhibit a gap between efficient coding theorems and existential coding theorems with near-optimal parameters.
• Optimal coding theorem for pK^t and unconditional Antunes-Fortnow. We consider pK^t complexity [Halley Goldberg et al., 2022], a variant of rKt where the randomness is public and the time bound is fixed. We observe the existence of an optimal coding theorem for pK^t, and employ this result to establish an unconditional version of a theorem of Antunes and Fortnow [Luis Filipe Coelho Antunes and Lance Fortnow, 2009], which characterizes the worst-case running times of languages that are in average polynomial time over all 𝖯-samplable distributions.
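
As announced before the list, the bounds from items 1 and 2 are restated here in LaTeX notation, taken verbatim from the items above:

\[
\mathrm{rKt}(x) \;\le\; (2 + o(1)) \cdot \log(1/\delta) + O(\log n) \qquad \text{(efficient coding theorem, item 1)},
\]
\[
\mathrm{rKt}(x) \;\le\; (2 - o(1)) \cdot \log(1/\delta) + \mathrm{poly}(\log n) \qquad \text{(ruled out for efficient coding under the PRG assumption, item 2)}.
\]

Item 3 attains the information-theoretically optimal leading term log(1/δ) for pK^t; the exact additive overhead is specified in the paper rather than in this abstract.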

Subject Classification

ACM Subject Classification
  • Theory of computation
Keywords
  • computational complexity
  • randomized algorithms
  • Kolmogorov complexity

References

  1. Scott Aaronson. The equivalence of sampling and searching. Theory Comput. Syst., 55(2):281-298, 2014.
  2. Eric Allender. Applications of time-bounded Kolmogorov complexity in complexity theory. In Kolmogorov complexity and computational complexity, pages 4-22. Springer, 1992.
  3. Eric Allender. When worlds collide: Derandomization, lower bounds, and Kolmogorov complexity. In International Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS), pages 1-15. Springer, 2001.
  4. Eric Allender. The complexity of complexity. In Computability and Complexity, pages 79-94. Springer, 2017.
  5. Luis Filipe Coelho Antunes and Lance Fortnow. Worst-case running times for average-case algorithms. In Conference on Computational Complexity (CCC), pages 298-303, 2009.
  6. Bruno Bauwens and Marius Zimand. Universal almost optimal compression and Slepian-Wolf coding in probabilistic polynomial time. CoRR, abs/1911.04268, 2019. URL: http://arxiv.org/abs/1911.04268.
  7. Shai Ben-David, Benny Chor, Oded Goldreich, and Michael Luby. On the theory of average case complexity. J. Comput. Syst. Sci., 44(2):193-219, 1992. URL: https://doi.org/10.1016/0022-0000(92)90019-F.
  8. Andrej Bogdanov and Luca Trevisan. Average-case complexity. Found. Trends Theor. Comput. Sci., 2(1), 2006.
  9. Harry Buhrman, Lance Fortnow, and Sophie Laplante. Resource-bounded Kolmogorov complexity revisited. SIAM J. Comput., 31(3):887-905, 2001.
  10. Harry Buhrman, Troy Lee, and Dieter van Melkebeek. Language compression and pseudorandom generators. Comput. Complex., 14(3):228-255, 2005.
  11. M. R. Capalbo, O. Reingold, S. P. Vadhan, and A. Wigderson. Randomness conductors and constant-degree lossless expanders. In Symposium on Theory of Computing (STOC), pages 659-668, 2002. URL: https://doi.org/10.1145/509907.510003.
  12. Li Chen, Rasmus Kyng, Yang P. Liu, Richard Peng, Maximilian Probst Gutenberg, and Sushant Sachdeva. Maximum flow and minimum-cost flow in almost-linear time. CoRR, abs/2203.00671, 2022. URL: https://doi.org/10.48550/arXiv.2203.00671.
  13. Kai-Min Chung, Siyao Guo, Qipeng Liu, and Luowen Qian. Tight quantum time-space tradeoffs for function inversion. In Symposium on Foundations of Computer Science (FOCS), pages 673-684, 2020.
  14. Anindya De, Luca Trevisan, and Madhur Tulsiani. Time space tradeoffs for attacks against one-way functions and PRGs. In Annual International Cryptology Conference (CRYPTO), pages 649-665, 2010.
  15. Amos Fiat and Moni Naor. Rigorous time/space trade-offs for inverting functions. SIAM J. Comput., 29(3):790-803, 1999. URL: https://doi.org/10.1137/S0097539795280512.
  16. Lance Fortnow. Kolmogorov complexity and computational complexity. Complexity of Computations and Proofs. Quaderni di Matematica, 13, 2004.
  17. Halley Goldberg, Valentine Kabanets, Zhenjian Lu, and Igor C. Oliveira. Probabilistic Kolmogorov complexity with applications to average-case complexity. Preprint, 2022.
  18. Venkatesan Guruswami, Christopher Umans, and Salil P. Vadhan. Unbalanced expanders and randomness extractors from Parvaresh-Vardy codes. J. ACM, 56(4):20:1-20:34, 2009.
  19. Iftach Haitner, Omer Reingold, and Salil P. Vadhan. Efficiency improvements in constructing pseudorandom generators from one-way functions. SIAM J. Comput., 42(3):1405-1430, 2013.
  20. Shuichi Hirahara. Non-black-box worst-case to average-case reductions within NP. In Symposium on Foundations of Computer Science (FOCS), pages 247-258, 2018.
  21. Shuichi Hirahara. Average-case hardness of NP from exponential worst-case hardness assumptions. In Symposium on Theory of Computing (STOC), pages 292-302, 2021.
  22. S. Hoory, N. Linial, and A. Wigderson. Expander graphs and their applications. Bull. Amer. Math. Soc., 43:439-561, 2006.
  23. Russell Impagliazzo and Ramamohan Paturi. On the complexity of k-SAT. J. Comput. Syst. Sci., 62(2):367-375, 2001. URL: https://doi.org/10.1006/jcss.2000.1727.
  24. Jan Krajíček. Information in propositional proofs and algorithmic proof search. The Journal of Symbolic Logic, pages 1-22, 2021.
  25. Troy Lee. Kolmogorov complexity and formula lower bounds. PhD thesis, University of Amsterdam, 2006.
  26. Leonid A. Levin. Randomness conservation inequalities; information and independence in mathematical theories. Information and Control, 61(1):15-37, 1984.
  27. Leonid A. Levin. Average case complete problems. SIAM J. Comput., 15(1):285-286, 1986. URL: https://doi.org/10.1137/0215020.
  28. Ming Li and Paul M. B. Vitányi. Average case complexity under the universal distribution equals worst-case complexity. Inf. Process. Lett., 42(3):145-149, 1992.
  29. Ming Li and Paul M. B. Vitányi. An introduction to Kolmogorov complexity and its applications. Springer-Verlag, 2019. 4th edition (1st edition in 1993).
  30. Yanyi Liu and Rafael Pass. On one-way functions and Kolmogorov complexity. In Symposium on Foundations of Computer Science (FOCS), pages 1243-1254, 2020.
  31. Zhenjian Lu and Igor C. Oliveira. An efficient coding theorem via probabilistic representations and its applications. In International Colloquium on Automata, Languages, and Programming (ICALP), pages 94:1-94:20, 2021.
  32. Zhenjian Lu, Igor C. Oliveira, and Rahul Santhanam. Pseudodeterministic algorithms and the structure of probabilistic time. In Symposium on Theory of Computing (STOC), pages 303-316, 2021.
  33. Zhenjian Lu, Igor C. Oliveira, and Marius Zimand. Optimal coding theorems in time-bounded Kolmogorov complexity, 2022. URL: https://doi.org/10.48550/ARXIV.2204.08312.
  34. Aleksander Madry. Navigating central path with electrical flows: From flows to matchings, and back. In Symposium on Foundations of Computer Science (FOCS), pages 253-262, 2013. URL: https://doi.org/10.1109/FOCS.2013.35.
  35. Igor C. Oliveira. Randomness and intractability in Kolmogorov complexity. In International Colloquium on Automata, Languages, and Programming (ICALP), pages 32:1-32:14, 2019.
  36. Hanlin Ren and Rahul Santhanam. Hardness of KT characterizes parallel cryptography. In Computational Complexity Conference (CCC), pages 35:1-35:58, 2021.
  37. Michael Sipser. A complexity theoretic approach to randomness. In Symposium on Theory of Computing (STOC), pages 330-335, 1983.
  38. Salil P. Vadhan and Colin Jia Zheng. A uniform min-max theorem with applications in cryptography. In Annual Cryptology Conference (CRYPTO), pages 93-110, 2013.