An Operational Characterization of Mutual Information in Algorithmic Information Theory

Authors: Andrei Romashchenko, Marius Zimand



File

LIPIcs.ICALP.2018.95.pdf
  • Filesize: 413 kB
  • 14 pages

Author Details

Andrei Romashchenko
  • LIRMM, Univ Montpellier, CNRS, Montpellier, France
  • on leave from IITP RAS
Marius Zimand
  • Department of Computer and Information Sciences, Towson University, Baltimore, MD, http://orion.towson.edu/~mzimand

Cite As

Andrei Romashchenko and Marius Zimand. An Operational Characterization of Mutual Information in Algorithmic Information Theory. In 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 95:1-95:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)
https://doi.org/10.4230/LIPIcs.ICALP.2018.95

Abstract

We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel, where one party holds x, the other holds y, and both know the complexity profile of the pair. For l > 2, the longest shared secret that can be established from a tuple of strings (x_1, ..., x_l) by l parties, each holding one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
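In standard algorithmic-information notation, the two results can be summarized as follows (a paraphrase of the abstract, not the paper's exact statement: C(.) denotes Kolmogorov complexity, I(x:y) = C(x) + C(y) - C(x,y) is the algorithmic mutual information, and "keylen" and "CO" are informal shorthands introduced here). Up to O(log n) additive precision:

    keylen(x, y)           =  I(x:y)  =  C(x) + C(y) - C(x, y)
    keylen(x_1, ..., x_l)  =  C(x_1, ..., x_l) - CO(x_1, ..., x_l)

where keylen is the length of the longest secret key the parties can agree on over a public channel, and CO(x_1, ..., x_l) is the minimum total communication needed for every party to learn the whole tuple.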

ACM Subject Classification
  • Mathematics of computing → Information theory
  • Theory of computation → Communication complexity
  • Theory of computation → Probabilistic computation
Keywords
  • Kolmogorov complexity
  • mutual information
  • communication complexity
  • secret key agreement
