Communication Complexity of the Secret Key Agreement in Algorithmic Information Theory
It is known that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel, assuming that the parties hold as their inputs x and y respectively. We determine the worst-case communication complexity of this problem for the setting where the parties can use private sources of random bits.
We show that for some pairs x, y the communication complexity of secret key agreement does not decrease even if the parties only need to agree on a secret key whose size is much smaller than the mutual information between x and y. On the other hand, we provide examples of pairs x, y for which the communication complexity of the protocol declines gradually with the size of the derived secret key.
The proof of the main result uses spectral properties of appropriate graphs and the expander mixing lemma as well as various information theoretic techniques.
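For context, the expander mixing lemma invoked above can be stated in its standard form (this is the generic lemma; the specific graphs to which the proof applies it are not described in this abstract):

```latex
% Expander mixing lemma (standard form).
% G: a d-regular graph on n vertices; \lambda: the second-largest
% absolute value of an eigenvalue of its adjacency matrix.
% For any vertex sets S and T, the number e(S,T) of edges between
% them deviates from its expected value d|S||T|/n by at most:
\[
  \left|\, e(S,T) - \frac{d\,|S|\,|T|}{n} \,\right|
  \;\le\; \lambda \sqrt{|S|\,|T|}.
\]
```

Informally, the smaller the spectral gap parameter \(\lambda\), the more the edge distribution of G resembles that of a random d-regular graph.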
Kolmogorov complexity
mutual information
communication complexity
expander mixing lemma
finite geometry
Mathematics of computing~Information theory
Theory of computation~Communication complexity
Security and privacy~Information-theoretic techniques
Theory of computation~Expander graphs and randomness extractors
44:1-44:14
Regular Paper
Supported in part by the ANR project RaCAF ANR-15-CE40-0016-01.
Full version of the paper: https://arxiv.org/abs/2004.13411
We thank the anonymous reviewers for instructive suggestions and corrections concerning this conference publication as well as the full version of the paper.
Emirhan
Gürpınar
Emirhan Gürpınar
LIRMM, Université de Montpellier, CNRS, France
Andrei
Romashchenko
Andrei Romashchenko
LIRMM, Université de Montpellier, CNRS, France
https://orcid.org/0000-0001-7723-7880
is grateful to the Max Planck Institute for Mathematics in the Sciences (Leipzig, Germany) for hospitality, and thanks Rostislav Matveev and Jacobus Portegies for useful discussions.
10.4230/LIPIcs.MFCS.2020.44
Emirhan Gürpınar and Andrei Romashchenko
Creative Commons Attribution 3.0 Unported license
https://creativecommons.org/licenses/by/3.0/legalcode