Rényi Information Complexity and an Information Theoretic Characterization of the Partition Bound

Authors: Manoj M. Prabhakaran and Vinod M. Prabhakaran



Cite As

Manoj M. Prabhakaran and Vinod M. Prabhakaran. Rényi Information Complexity and an Information Theoretic Characterization of the Partition Bound. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 88:1-88:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016). https://doi.org/10.4230/LIPIcs.ICALP.2016.88

Abstract

In this work we introduce a new information-theoretic complexity measure for 2-party functions, called Rényi information complexity. It is a lower bound on communication complexity, and it has the two leading lower bounds on communication complexity as its natural relaxations: (external) information complexity and the logarithm of partition complexity. These two lower bounds had so far appeared conceptually quite different from each other, but we show that both are obtained from Rényi information complexity using two different, but natural, relaxations:
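For orientation, a hedged sketch in our own notation (the paper's precise definitions may differ): "Rényi mutual information" here refers to Sibson's order-α proposal (see the references to Sibson, 1969, and Verdú, 2015), which recovers Shannon mutual information as α → 1. Writing (X, Y) for the inputs and Π for the transcript of a protocol π computing f,

$$ I_\alpha(X;Y) \;=\; \min_{Q_Y} D_\alpha\big(P_{XY} \,\big\|\, P_X \times Q_Y\big), \qquad I_\infty(X;Y) \;=\; \log \sum_{y} \max_{x \,:\, P_X(x) > 0} P_{Y|X}(y \mid x), $$

and a schematic form of the new measure, consistent with the abstract, is

$$ \mathrm{IC}_\infty(f) \;\approx\; \inf_{\pi \text{ computing } f} I_\infty(XY ;\, \Pi), $$

possibly with a further optimization over the input distribution; the abstract itself asserts only the order-∞ choice and the use of protocol transcripts.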

1. The relaxation of Rényi information complexity that yields information complexity is to change the order of the Rényi mutual information used in its definition from infinity to 1 (see the sketch following this list).

2. The relaxation that connects Rényi information complexity with partition complexity is to replace the protocol transcripts used in the definition of Rényi information complexity with what we term "pseudotranscripts". A pseudotranscript omits the interactive nature of a protocol, and only requires that the probability of any transcript, given inputs x and y to the two parties, factorizes into two terms which depend on x and y separately. While this relaxation yields an apparently different definition from the (log of the) partition bound, we show that the two are in fact identical. This gives us a surprising characterization of the partition bound in terms of an information-theoretic quantity.
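In our notation (a hedged sketch; q_A and q_B are names we introduce): the transcript T of a genuine protocol always satisfies the rectangle (cut-and-paste) property

$$ \Pr[\, T = t \mid X = x,\, Y = y \,] \;=\; q_A(t, x) \cdot q_B(t, y) \qquad \text{for all } t, x, y, \ \text{with } q_A, q_B \ge 0, $$

and a pseudotranscript is required to satisfy only this factorization, without arising from any message-by-message interaction. Relaxation 1, by contrast, keeps genuine transcripts but lowers the order, replacing I_∞(XY; Π) by I_1(XY; Π) = I(XY; Π), the Shannon mutual information, which yields external information complexity.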

We also show that if both of the above relaxations are applied simultaneously to Rényi information complexity, we obtain a complexity measure that is lower-bounded by the (log of the) relaxed partition complexity, a complexity measure introduced by Kerenidis et al. (FOCS 2012). We obtain a sharper connection between (external) information complexity and relaxed partition complexity than that of Kerenidis et al., using an arguably more direct proof.

A further understanding of Rényi information complexity (of various orders) might have consequences for important direct-sum problems in communication complexity, since it lies between communication complexity and information complexity.
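Schematically, the relations asserted above read as follows (our notation: CC for randomized communication complexity, IC_ext for external information complexity, prt for partition complexity, and its barred variant for relaxed partition complexity):

$$ \mathrm{CC}(f) \;\ge\; \mathrm{IC}_\infty(f) \;\ge\; \max\big\{\, \mathrm{IC}_{\mathrm{ext}}(f),\ \log \mathrm{prt}(f) \,\big\}, $$

$$ \mathrm{IC}_{\mathrm{ext}}(f) \;\ge\; (\text{doubly relaxed measure}) \;\ge\; \log \overline{\mathrm{prt}}(f). $$

The second chain is what yields the sharper connection between external information complexity and relaxed partition complexity: each relaxation either enlarges the set being optimized over or lowers the order α, and so can only decrease the measure.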

Keywords
  • Information Complexity
  • Communication Complexity
  • Rényi Mutual Information


References

  1. Farid M. Ablayev. Lower bounds for one-way probabilistic communication complexity and their application to space complexity. Theor. Comput. Sci., 157(2):139-159, 1996. URL: http://dx.doi.org/10.1016/0304-3975(95)00157-3.
  2. Ziv Bar-Yossef, T. S. Jayram, Ravi Kumar, and D. Sivakumar. An information statistics approach to data stream and communication complexity. J. Comput. Syst. Sci., 68(4):702-732, 2004. URL: http://dx.doi.org/10.1016/j.jcss.2003.11.006.
  3. Boaz Barak, Mark Braverman, Xi Chen, and Anup Rao. How to compress interactive communication. SIAM J. Comput., 42(3):1327-1363, 2013. URL: http://dx.doi.org/10.1137/100811969.
  4. Mark Braverman. Interactive information complexity. In STOC, pages 505-524, 2012. URL: http://dx.doi.org/10.1145/2213977.2214025.
  5. Mark Braverman and Anup Rao. Information equals amortized communication. In FOCS, pages 748-757, 2011. URL: http://dx.doi.org/10.1109/FOCS.2011.86.
  6. Mark Braverman and Omri Weinstein. A discrepancy lower bound for information complexity. In APPROX-RANDOM, pages 459-470, 2012. URL: http://dx.doi.org/10.1007/978-3-642-32512-0_39.
  7. Amit Chakrabarti, Ranganath Kondapally, and Zhenghui Wang. Information complexity versus corruption and applications to orthogonality and gap-hamming. In APPROX-RANDOM, pages 483-494, 2012. URL: http://dx.doi.org/10.1007/978-3-642-32512-0_41.
  8. Amit Chakrabarti, Yaoyun Shi, Anthony Wirth, and Andrew Chi-Chih Yao. Informational complexity and the direct sum problem for simultaneous message complexity. In FOCS, pages 270-278, 2001. URL: http://dx.doi.org/10.1109/SFCS.2001.959901.
9. Lila Fontes, Rahul Jain, Iordanis Kerenidis, Sophie Laplante, Mathieu Laurière, and Jérémie Roland. Relative discrepancy does not separate information and communication complexity. In ICALP, pages 506-516, 2015.
  10. Anat Ganor, Gillat Kol, and Ran Raz. Exponential separation of information and communication. In FOCS, pages 176-185, 2014.
  11. Anat Ganor, Gillat Kol, and Ran Raz. Exponential separation of communication and external information. In Electronic Colloquium on Computational Complexity (ECCC), 2015.
  12. Prahladh Harsha, Rahul Jain, David McAllester, and Jaikumar Radhakrishnan. The communication complexity of correlation. IEEE Transactions on Information Theory, 56(1):438-449, 2010. URL: http://dx.doi.org/10.1109/TIT.2009.2034824.
13. Siu-Wai Ho and Sergio Verdú. Convexity/concavity of Rényi entropy and α-mutual information. In IEEE International Symposium on Information Theory (ISIT), pages 745-749, 2015.
  14. Rahul Jain and Hartmut Klauck. The partition bound for classical communication complexity and query complexity. In IEEE Conference on Computational Complexity, pages 247-258, 2010.
  15. Rahul Jain, Jaikumar Radhakrishnan, and Pranab Sen. A direct sum theorem in communication complexity via message compression. In ICALP, pages 300-315, 2003. URL: http://dx.doi.org/10.1007/3-540-45061-0_26.
  16. Rahul Jain, Jaikumar Radhakrishnan, and Pranab Sen. Prior entanglement, message compression and privacy in quantum communication. In IEEE Conference on Computational Complexity, pages 285-296, 2005. URL: http://dx.doi.org/10.1109/CCC.2005.24.
  17. T. S. Jayram, Ravi Kumar, and D. Sivakumar. Two applications of information complexity. In STOC, pages 673-682, 2003. URL: http://dx.doi.org/10.1145/780542.780640.
18. Iordanis Kerenidis, Sophie Laplante, Virginie Lerays, Jérémie Roland, and David Xiao. Lower bounds on information complexity via zero-communication protocols and applications. In FOCS, pages 500-509, 2012.
  19. Eyal Kushilevitz and Noam Nisan. Communication Complexity. Cambridge University Press, New York, 1997.
  20. Stephen J. Ponzio, Jaikumar Radhakrishnan, and Srinivasan Venkatesh. The communication complexity of pointer chasing. Journal of Computer and System Sciences, 62(2):323-355, 2001.
  21. Manoj Prabhakaran and Vinod M. Prabhakaran. Tension bounds for information complexity. CoRR, abs/1408.6285, 2014. URL: http://arxiv.org/abs/1408.6285.
22. Alfred Rényi. On measures of information and entropy. In Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, pages 547-561, 1960.
  23. Michael E. Saks and Xiaodong Sun. Space lower bounds for distance approximation in the data stream model. In STOC, pages 360-369, 2002. URL: http://dx.doi.org/10.1145/509907.509963.
24. R. Sibson. Information radius. Z. Wahrscheinlichkeitstheorie und Verw. Geb., 14:149-161, 1969.
  25. Sergio Verdú. α-mutual information. In Information Theory and Applications Workshop (ITA), 2015.
  26. Andrew Chi-Chih Yao. Some complexity questions related to distributive computing (preliminary report). In STOC, pages 209-213, 1979. URL: http://dx.doi.org/10.1145/800135.804414.
27. Moshe Zakai and Jacob Ziv. A generalization of the rate-distortion theory and applications. In Information Theory: New Trends and Open Problems, pages 87-123. Springer, 1975.
  28. Jacob Ziv and Moshe Zakai. On functionals satisfying a data-processing theorem. IEEE Transactions on Information Theory, 19(3):275-283, 1973.