Tight Bounds for Communication-Assisted Agreement Distillation

Authors: Venkatesan Guruswami, Jaikumar Radhakrishnan

Cite As

Venkatesan Guruswami and Jaikumar Radhakrishnan. Tight Bounds for Communication-Assisted Agreement Distillation. In 31st Conference on Computational Complexity (CCC 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 50, pp. 6:1-6:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Suppose Alice holds a uniformly random string X in {0,1}^N and Bob holds a noisy version Y of X in which each bit of X is flipped independently with probability epsilon in [0,1/2]. Alice and Bob would like to extract a common random string of min-entropy at least k. In this work, we establish the communication versus success probability trade-off for this problem by giving a protocol and a matching lower bound (under the restriction that the string to be agreed upon is determined by Alice's input X). Specifically, we prove that in order for Alice and Bob to agree on a common string with probability 2^{-gamma k} (gamma k >= 1), the optimal communication (up to o(k) terms, and achievable for large N) is precisely (C * (1-gamma) - 2 * sqrt{C * (1-C) * gamma}) * k, where C := 4 * epsilon * (1-epsilon). In particular, the optimal communication needed to achieve Omega(1) agreement probability approaches 4 * epsilon * (1-epsilon) * k.

We also consider the case when Y is the output of the binary erasure channel on X, where each bit of Y equals the corresponding bit of X with probability 1-epsilon and is otherwise erased (that is, replaced by a "?"). In this case, the communication required becomes (epsilon * (1-gamma) - 2 * sqrt{epsilon * (1-epsilon) * gamma}) * k. In particular, the optimal communication needed to achieve Omega(1) agreement probability approaches epsilon * k, and with no communication the optimal agreement probability approaches 2^{- (1-sqrt{1-epsilon})/(1+sqrt{1-epsilon}) * k}.

Our protocols are based on covering codes and extend the approach of Bogdanov and Mossel (2011) for the zero-communication case. Our lower bounds rely on hypercontractive inequalities. For the bit-flip model, our argument extends the approach of Bogdanov and Mossel (2011) by allowing communication; for the erasure model, to the best of our knowledge the needed hypercontractivity statement had not been studied before, and it was established, prompted by our application, by Nair and Wang (2015).
We also obtain information complexity lower bounds for these tasks, and together with our protocol, they shed light on the recently popular "most informative Boolean function" conjecture of Courtade and Kumar.
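As a numerical sanity check on the stated trade-offs, the sketch below (our own illustration, not code from the paper; function names are ours) evaluates both communication bounds per output bit and recovers the zero-communication agreement exponent for the erasure model by solving for the gamma at which the required communication drops to zero.

```python
import math

def comm_bsc(eps: float, gamma: float) -> float:
    """Optimal communication per output bit (i.e., divided by k) for the
    bit-flip model: C*(1-gamma) - 2*sqrt(C*(1-C)*gamma), C = 4*eps*(1-eps)."""
    C = 4 * eps * (1 - eps)
    return C * (1 - gamma) - 2 * math.sqrt(C * (1 - C) * gamma)

def comm_bec(eps: float, gamma: float) -> float:
    """Optimal communication per output bit for the erasure model:
    eps*(1-gamma) - 2*sqrt(eps*(1-eps)*gamma)."""
    return eps * (1 - gamma) - 2 * math.sqrt(eps * (1 - eps) * gamma)

def zero_comm_gamma_bec(eps: float) -> float:
    """Bisect for the gamma at which comm_bec reaches zero; comm_bec is
    strictly decreasing in gamma, positive at gamma=0, negative at gamma=1."""
    lo, hi = 0.0, 1.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if comm_bec(eps, mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

eps = 0.3
# As gamma -> 0 (constant agreement probability), the bit-flip bound
# tends to C = 4*eps*(1-eps) and the erasure bound tends to eps.
print(comm_bsc(eps, 0.0))  # 4*0.3*0.7 = 0.84, up to floating point
print(comm_bec(eps, 0.0))  # 0.3
# Zero communication in the erasure model forces
# gamma = (1 - sqrt(1-eps)) / (1 + sqrt(1-eps)), matching the abstract.
s = math.sqrt(1 - eps)
print(abs(zero_comm_gamma_bec(eps) - (1 - s) / (1 + s)) < 1e-9)  # True
```

One can verify the last identity by hand: setting epsilon*(1-gamma) = 2*sqrt{epsilon*(1-epsilon)*gamma} and substituting s = sqrt{1-epsilon} gives gamma = (1-s)/(1+s).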

Keywords

  • communication complexity
  • covering codes
  • hypercontractivity
  • information theory
  • lower bounds
  • pseudorandomness



References

  1. Rudolf Ahlswede and Imre Csiszár. Common randomness in information theory and cryptography - part II: CR capacity. IEEE Transactions on Information Theory, 44(1):225-240, 1998.
  2. Venkat Anantharam, Amin Aminzadeh Gohari, Sudeep Kamath, and Chandra Nair. On hypercontractivity and the mutual information between boolean functions. In Proceedings of the 51st Annual Allerton Conference on Communication, Control, and Computing, pages 13-19, 2013. URL: http://dx.doi.org/10.1109/Allerton.2013.6736499.
  3. Venkat Anantharam, Amin Aminzadeh Gohari, Sudeep Kamath, and Chandra Nair. On maximal correlation, hypercontractivity, and the data processing inequality studied by Erkip and Cover. CoRR, abs/1304.6133, 2013. URL: http://arxiv.org/abs/1304.6133.
  4. Venkat Anantharam, Amin Aminzadeh Gohari, Sudeep Kamath, and Chandra Nair. On hypercontractivity and a data processing inequality. In 2014 IEEE International Symposium on Information Theory, pages 3022-3026, 2014. URL: http://dx.doi.org/10.1109/ISIT.2014.6875389.
  5. Andrej Bogdanov and Elchanan Mossel. On extracting common random bits from correlated sources. IEEE Transactions on Information Theory, 57(10):6351-6355, 2011.
  6. Clément Louis Canonne, Venkatesan Guruswami, Raghu Meka, and Madhu Sudan. Communication with imperfectly shared randomness. In Proceedings of the 2015 Conference on Innovations in Theoretical Computer Science, pages 257-262, 2015. URL: http://dx.doi.org/10.1145/2688073.2688099.
  7. Venkat Chandar and Aslan Tchamkerten. Most informative quantization functions. In Proceedings of ITA Workshop, 2014. Available at http://perso.telecom-paristech.fr/~tchamker/CTAT.pdf.
  8. John D. Cook. Upper and lower bounds for the normal distribution function, 2009. URL: http://www.johndcook.com/normalbounds.pdf.
  9. Thomas A. Courtade and Gowtham R. Kumar. Which Boolean functions maximize mutual information on noisy inputs? IEEE Transactions on Information Theory, 60(8):4515-4525, 2014. URL: http://dx.doi.org/10.1109/TIT.2014.2326877.
  10. Elza Erkip. The efficiency of information in investment. PhD thesis, Stanford University, 1996.
  11. Elza Erkip and Thomas M. Cover. The efficiency of investment information. IEEE Transactions on Information Theory, 44(3):1026-1040, 1998. URL: http://dx.doi.org/10.1109/18.669153.
  12. William Feller. An Introduction to Probability Theory and Its Applications, volume 2. John Wiley and Sons, 1971.
  13. Péter Gács and János Körner. Common information is far less than mutual information. Problems of Control and Information Theory, 2(2):119-162, 1972.
  14. Guy Kindler, Ryan O'Donnell, and David Witmer. Remarks on the most informative function conjecture at fixed mean. CoRR, abs/1506.03167, 2015. URL: http://arxiv.org/abs/1506.03167.
  15. Konstantin Makarychev and Yury Makarychev. Chain independence and common information. IEEE Transactions on Information Theory, 58(8):5279-5286, 2012.
  16. Elchanan Mossel and Ryan O'Donnell. Coin flipping from a cosmic source: On error correction of truly random bits. Random Struct. Algorithms, 26(4):418-436, 2005. URL: http://dx.doi.org/10.1002/rsa.20062.
  17. Elchanan Mossel, Ryan O'Donnell, Oded Regev, Jeffrey Steif, and Benny Sudakov. Non-interactive correlation distillation, inhomogeneous Markov chains, and the reverse Bonami-Beckner inequality. Israel J. Math., 154:299-336, 2006.
  18. Chandra Nair and Yannan Wang. Evaluating hypercontractivity parameters using information measures. Manuscript, 2015.
  19. Ryan O'Donnell. Analysis of Boolean Functions. Cambridge University Press, 2014. URL: http://www.cambridge.org/de/academic/subjects/computer-science/algorithmics-complexity-computer-algebra-and-computational-g/analysis-boolean-functions.
  20. Or Ordentlich, Ofer Shayevitz, and Omri Weinstein. Dictatorship is the most informative balanced function at the extremes. Electronic Colloquium on Computational Complexity (ECCC), 22:84, 2015. URL: http://eccc.hpi-web.de/report/2015/084.
  21. Andrei Romashchenko. Pairs of words with nonmaterializable mutual information. Problems of Information Transmission, 36(1):1-18, 2000.
  22. Alex Samorodnitsky. The "most informative Boolean function" conjecture holds for high noise. CoRR, abs/1510.08656, 2015.
  23. Alex Samorodnitsky. On the entropy of a noisy function. Electronic Colloquium on Computational Complexity (ECCC), 22:129, 2015.
  24. Aaron D. Wyner and Jacob Ziv. A theorem on the entropy of certain binary sequences and applications: Part 1. IEEE Transactions on Information Theory, 19(6):769-772, 1973. URL: http://dx.doi.org/10.1109/TIT.1973.1055107.
  25. Ke Yang. On the (im)possibility of non-interactive correlation distillation. Theor. Comput. Sci., 382(2):157-166, 2007. URL: http://dx.doi.org/10.1016/j.tcs.2007.03.007.
  26. Lei Zhao and Yeow-Khiang Chia. The efficiency of common randomness generation. In Proceedings of the 49th Annual Allerton Conference on Communication, Control, and Computing, pages 944-950, 2011.