Information Complexity Is Computable

Authors: Mark Braverman, Jon Schneider



File: LIPIcs.ICALP.2016.87.pdf (461 kB, 10 pages)


Cite As

Mark Braverman and Jon Schneider. Information Complexity Is Computable. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 87:1-87:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016) https://doi.org/10.4230/LIPIcs.ICALP.2016.87

Abstract

The information complexity of a function f is the minimum amount of information Alice and Bob need to exchange to compute the function f. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function f to within any additive error epsilon > 0, thus resolving an open question as to whether information complexity is computable. 
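For context, one standard formalization (a sketch only; the notation μ, X, Y, Π is supplied here for illustration and is not quoted from the paper) measures, for a protocol π run on inputs (X, Y) ~ μ with transcript Π, the internal information cost of π and the resulting information complexity of f:

\[
\mathrm{IC}_{\mu}(\pi) = I(\Pi ; X \mid Y) + I(\Pi ; Y \mid X),
\qquad
\mathrm{IC}_{\mu}(f) = \inf_{\pi \text{ computing } f} \mathrm{IC}_{\mu}(\pi),
\]

where the infimum ranges over protocols that compute f (exactly or within a fixed error, depending on the variant); prior-free versions additionally optimize over the input distribution μ.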

In the process, we give the first explicit upper bound on the rate of convergence of the information complexity of f when restricted to b-bit protocols to the (unrestricted) information complexity of f.
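A sketch of how the two results fit together, under the natural reading that the algorithm searches over bounded-length protocols (the exact bound and algorithm are given in the paper): writing \(\mathrm{IC}^{b}_{\mu}(f)\) for the infimum of information cost over protocols that exchange at most b bits, one has

\[
\mathrm{IC}^{b}_{\mu}(f) \ge \mathrm{IC}^{b+1}_{\mu}(f) \ge \mathrm{IC}_{\mu}(f)
\quad\text{and}\quad
\lim_{b \to \infty} \mathrm{IC}^{b}_{\mu}(f) = \mathrm{IC}_{\mu}(f).
\]

An explicit convergence bound \(\delta(b)\) with \(\mathrm{IC}^{b}_{\mu}(f) - \mathrm{IC}_{\mu}(f) \le \delta(b)\) (the bound may also depend on the input size) then makes the quantity computable: given epsilon > 0, choose b large enough that \(\delta(b) \le \varepsilon/2\) and approximate the infimum over b-bit protocols, a finite-dimensional optimization, to within epsilon/2.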


Keywords
  • Communication complexity
  • convergence rate
  • information complexity

