Internal Compression of Protocols to Entropy

Authors: Balthazar Bauer, Shay Moran, Amir Yehudayoff


Cite As

Balthazar Bauer, Shay Moran, and Amir Yehudayoff. Internal Compression of Protocols to Entropy. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 40, pp. 481-496, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


Abstract

We study internal compression of communication protocols to their internal entropy, which is the entropy of the transcript from the players' perspective. We provide two internal compression schemes with error. The first compresses the protocol of Feige et al. for finding the first difference between two strings. The second, and main, scheme compresses a protocol with internal entropy H^{int} and communication complexity C, with error epsilon > 0, to a protocol with communication of order at most (H^{int}/epsilon)^2 * log(log(C)). This immediately implies a similar compression to the internal information of public-coin protocols, which is an exponential improvement over previously known public-coin compressions in the dependence on C. It further shows that in a recent protocol of Ganor, Kol and Raz, the private randomness cannot be made public without an exponential cost. To the best of our knowledge, no such example was previously known.
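For context on the first scheme, the first-difference problem asks two players holding equal-length strings to find the first index at which their strings differ. Below is a minimal sketch of the standard binary-search approach to this problem; in an actual communication protocol each prefix comparison would be replaced by exchanging a short random fingerprint (hash) of the prefixes. This sketch is illustrative only and is not the compression scheme of the paper.

```python
def first_difference(x: str, y: str) -> int:
    """Return the first index where equal-length strings x and y differ,
    or -1 if they are equal, via binary search for the longest common
    prefix. Each prefix comparison stands in for one round in which the
    players would exchange a short random fingerprint of their prefixes."""
    assert len(x) == len(y)
    if x == y:
        return -1
    lo, hi = 0, len(x)          # invariant: x[:lo] == y[:lo]
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if x[:mid] == y[:mid]:  # prefixes agree: difference lies at index >= mid
            lo = mid
        else:                   # prefixes disagree: difference is at index < mid
            hi = mid - 1
    return lo
```

The search makes O(log n) comparisons; replacing each comparison with a constant-size random fingerprint gives a protocol with O(log n) communication and constant error, the kind of baseline that entropy-based compression aims to improve.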
Keywords

  • Communication complexity
  • Information complexity
  • Compression
  • Simulation
  • Entropy
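As a toy numeric illustration (not from the paper), the snippet below computes the Shannon entropy of a transcript distribution and evaluates the abstract's communication bound (H^{int}/epsilon)^2 * log(log C), with the constant hidden by "order" taken to be 1. The distribution and parameter values are invented for the example.

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a distribution {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A skewed toy transcript distribution: its entropy is well below
# log2(number of transcripts), which is when compression can pay off.
transcripts = {"00": 0.85, "01": 0.05, "10": 0.05, "11": 0.05}
H_int = entropy(transcripts)  # below 1 bit, versus 2 bits for the raw transcript

def compressed_communication(H_int, C, eps):
    """The abstract's bound (H_int/eps)^2 * log2(log2(C)),
    with the hidden constant set to 1 for illustration."""
    return (H_int / eps) ** 2 * math.log2(math.log2(C))
```

Note that the bound depends on the original communication C only doubly-logarithmically, which is the exponential improvement in the dependence on C that the abstract highlights.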




References

  1. Ziv Bar-Yossef, T. S. Jayram, Ravi Kumar, and D. Sivakumar. An information statistics approach to data stream and communication complexity. J. Comput. Syst. Sci., 68(4):702-732, 2004.
  2. Boaz Barak, Mark Braverman, Xi Chen, and Anup Rao. How to compress interactive communication. SIAM J. Comput., 42(3):1327-1363, 2013.
  3. Mark Braverman. Interactive information complexity. In STOC, pages 505-524, 2012.
  4. Mark Braverman and Ankit Garg. Public vs private coin in bounded-round information. In ICALP (1), pages 502-513, 2014.
  5. Mark Braverman and Anup Rao. Information equals amortized communication. In FOCS, pages 748-757, 2011.
  6. Mark Braverman, Anup Rao, Omri Weinstein, and Amir Yehudayoff. Direct product via round-preserving compression. In ICALP (1), pages 232-243, 2013.
  7. Mark Braverman, Anup Rao, Omri Weinstein, and Amir Yehudayoff. Direct products in communication complexity. In FOCS, pages 746-755, 2013.
  8. Joshua Brody, Harry Buhrman, Michal Koucký, Bruno Loff, Florian Speelman, and Nikolay K. Vereshchagin. Towards a reverse Newman's theorem in interactive information complexity. In IEEE Conference on Computational Complexity, pages 24-33, 2013.
  9. Amit Chakrabarti, Yaoyun Shi, Anthony Wirth, and Andrew Chi-Chih Yao. Informational complexity and the direct sum problem for simultaneous message complexity. In FOCS, pages 270-278, 2001.
  10. Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. Wiley-Interscience, 2006.
  11. Martin Dietzfelbinger and Henning Wunderlich. A characterization of average case communication complexity. Inf. Process. Lett., 101(6):245-249, 2007.
  12. R. M. Fano. The transmission of information. Technical Report 65, Research Laboratory of Electronics, MIT, Cambridge, MA, USA, 1949.
  13. Uriel Feige, David Peleg, Prabhakar Raghavan, and Eli Upfal. Computing with noisy information. SIAM J. Comput., 23(5):1001-1018, 1994.
  14. Abbas El Gamal and Alon Orlitsky. Interactive data compression. In FOCS, pages 100-108, 1984.
  15. Anat Ganor, Gillat Kol, and Ran Raz. Exponential separation of information and communication. Electronic Colloquium on Computational Complexity (ECCC), 2014.
  16. Prahladh Harsha, Rahul Jain, David A. McAllester, and Jaikumar Radhakrishnan. The communication complexity of correlation. IEEE Transactions on Information Theory, 56(1):438-449, 2010.
  17. David A. Huffman. A method for the construction of minimum-redundancy codes. Proceedings of the IRE, 40(9):1098-1101, September 1952.
  18. Gillat Kol, Shay Moran, Amir Shpilka, and Amir Yehudayoff. Direct sum fails for zero error average communication. In ITCS, pages 517-522, 2014.
  19. Eyal Kushilevitz and Noam Nisan. Communication Complexity. Cambridge University Press, 1997.
  20. Moni Naor, Alon Orlitsky, and Peter Shor. Three results on interactive communication. IEEE Transactions on Information Theory, 39(5):1608-1615, September 1993.
  21. Ilan Newman. Private vs. common random bits in communication complexity. Inf. Process. Lett., 39(2):67-71, 1991.
  22. Alon Orlitsky. Average-case interactive communication. IEEE Transactions on Information Theory, 38(5):1534-1547, September 1992.
  23. Alon Orlitsky and Abbas El Gamal. Average and randomized communication complexity. IEEE Transactions on Information Theory, 36(1):3-16, 1990.
  24. Denis Pankratov. Direct sum questions in classical communication complexity. PhD thesis, University of Chicago, 2012.
  25. C. E. Shannon. A mathematical theory of communication. Bell System Technical Journal, 27:379-423, 623-656, 1948.
  26. David Slepian and Jack K. Wolf. Noiseless coding of correlated information sources. IEEE Transactions on Information Theory, 19(4), July 1973.
  27. Emanuele Viola. The communication complexity of addition. In SODA, pages 632-651, 2013.
  28. Andrew Chi-Chih Yao. Some complexity questions related to distributive computing (preliminary report). In STOC, pages 209-213, 1979.