
An Entropy Sumset Inequality and Polynomially Fast Convergence to Shannon Capacity Over All Alphabets

Authors: Venkatesan Guruswami and Ameya Velingker



File: LIPIcs.CCC.2015.42.pdf (0.52 MB, 16 pages)


Cite As

Venkatesan Guruswami and Ameya Velingker. An Entropy Sumset Inequality and Polynomially Fast Convergence to Shannon Capacity Over All Alphabets. In 30th Conference on Computational Complexity (CCC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 33, pp. 42-57, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)
https://doi.org/10.4230/LIPIcs.CCC.2015.42

Abstract

We prove a lower estimate on the increase in entropy when two copies of a conditional random variable X | Y, with X supported on Z_q = {0, 1, ..., q-1} for prime q, are summed modulo q. Specifically, given two i.i.d. copies (X_1, Y_1) and (X_2, Y_2) of a pair of random variables (X, Y), with X taking values in Z_q, we show

H(X_1 + X_2 | Y_1, Y_2) - H(X | Y) >= alpha(q) * H(X | Y) * (1 - H(X | Y))

for some alpha(q) > 0, where H(.) is the entropy normalized by the factor log_2(q). In particular, if X | Y is not close to being fully random or fully deterministic, i.e., H(X | Y) \in (gamma, 1 - gamma), then the entropy of the sum increases by Omega_q(gamma). Our motivation is an effective analysis of the finite-length behavior of polar codes, for which the linear dependence on gamma is quantitatively important. The assumption that q is prime is necessary: for X supported uniformly on a proper subgroup of Z_q, we have H(X_1 + X_2) = H(X). For X supported on infinite groups with no nontrivial finite subgroup (the torsion-free case), and with no conditioning, a sumset inequality for the absolute increase in (unnormalized) entropy was shown by Tao [Tao, CP&R 2010].

We use our sumset inequality to analyze Arıkan's construction of polar codes and prove that for any q-ary source X, where q is any fixed prime, and any epsilon > 0, polar codes allow efficient data compression of N i.i.d. copies of X into (H(X) + epsilon)N q-ary symbols as soon as N is polynomially large in 1/epsilon. We obtain capacity-achieving source codes with similar guarantees for composite alphabets by factoring the alphabet size into primes and combining polar codes for each prime in the factorization.

A consequence of our result for noisy channel coding is that for all discrete memoryless channels, there are explicit codes enabling reliable communication within epsilon > 0 of the symmetric Shannon capacity with block length and decoding complexity bounded by a polynomial in 1/epsilon. This result was previously known for the special case of binary-input channels [Guruswami/Xia, FOCS'13; Hassani/Alishahi/Urbanke, CoRR 2013]; this work extends it to channels over any alphabet.
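The role of the primality assumption can be illustrated numerically in the simplest unconditioned setting (no side information Y). The sketch below, which is illustrative only and not part of the paper, computes the normalized entropy of a biased source on Z_5 before and after summing two i.i.d. copies modulo 5, where the entropy strictly increases, and contrasts it with a source uniform on the proper subgroup {0, 2} of Z_4, where summing gains nothing:

```python
import math
from itertools import product

def normalized_entropy(p, q):
    """Shannon entropy of distribution p over Z_q, normalized by log2(q)."""
    h = -sum(x * math.log2(x) for x in p if x > 0)
    return h / math.log2(q)

def sum_distribution(p, q):
    """Distribution of X_1 + X_2 (mod q) for i.i.d. X_1, X_2 ~ p."""
    s = [0.0] * q
    for a, b in product(range(q), repeat=2):
        s[(a + b) % q] += p[a] * p[b]
    return s

# Prime alphabet: a biased source on Z_5 strictly gains entropy when summed.
q = 5
p = [0.6, 0.1, 0.1, 0.1, 0.1]
assert normalized_entropy(sum_distribution(p, q), q) > normalized_entropy(p, q)

# Composite alphabet: uniform on the subgroup {0, 2} of Z_4 gains nothing,
# showing why the inequality requires q to be prime.
q = 4
p = [0.5, 0.0, 0.5, 0.0]
h_before = normalized_entropy(p, q)
h_after = normalized_entropy(sum_distribution(p, q), q)
assert abs(h_after - h_before) < 1e-12
```

In the Z_4 case the sum of two elements of {0, 2} stays in {0, 2}, so the sum distribution equals the original and the entropy is unchanged, matching the H(X_1 + X_2) = H(X) obstruction noted in the abstract.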
Keywords
  • Polar codes
  • polynomial gap to capacity
  • entropy sumset inequality
  • arbitrary alphabets

Metrics

  • Access Statistics
  • Total Accesses (updated on a weekly basis)
    0
    PDF Downloads

References

  1. Emmanuel Abbe and Emre Telatar. Polar codes for the m-user multiple access channel. IEEE Transactions on Information Theory, 58(8):5437-5448, 2012.
  2. Erdal Arıkan. Channel polarization: a method for constructing capacity-achieving codes for symmetric binary-input memoryless channels. IEEE Transactions on Information Theory, 55(7):3051-3073, 2009.
  3. Erdal Arıkan. Source polarization. In Proceedings of 2010 IEEE International Symposium on Information Theory, pages 899-903, 2010.
  4. Erdal Arıkan and Emre Telatar. On the rate of channel polarization. In Proceedings of 2009 IEEE International Symposium on Information Theory, pages 1493-1495, 2009.
  5. Naveen Goela, Emmanuel Abbe, and Michael Gastpar. Polar codes for broadcast channels. In Proceedings of the 2013 IEEE International Symposium on Information Theory, Istanbul, Turkey, July 7-12, 2013, pages 1127-1131, 2013.
  6. Venkatesan Guruswami and Ameya Velingker. An entropy sumset inequality and polynomially fast convergence to Shannon capacity over all alphabets. Electronic Colloquium on Computational Complexity (ECCC), 21:165, 2014.
  7. Venkatesan Guruswami and Patrick Xia. Polar codes: Speed of polarization and polynomial gap to capacity. In FOCS, pages 310-319, 2013. Full version to appear in IEEE Trans. on Info. Theory, Jan. 2015.
  8. Saeid Haghighatshoar, Emmanuel Abbe, and İ. Emre Telatar. A new entropy power inequality for integer-valued random variables. IEEE Transactions on Information Theory, 60(7):3787-3796, 2014.
  9. Seyed Hamed Hassani, Kasra Alishahi, and Rüdiger L. Urbanke. Finite-length scaling of polar codes. CoRR, abs/1304.4778, 2013.
  10. Varun Jog and Venkat Anantharam. The entropy power inequality and Mrs. Gerber's lemma for groups of order 2ⁿ. IEEE Transactions on Information Theory, 60(7):3773-3786, 2014.
  11. Satish Babu Korada. Polar codes for Slepian-Wolf, Wyner-Ziv, and Gelfand-Pinsker. In Proceedings of the 2010 IEEE Information Theory Workshop, pages 1-5, 2010.
  12. Satish Babu Korada, Eren Sasoglu, and Rüdiger L. Urbanke. Polar codes: Characterization of exponent, bounds, and constructions. IEEE Transactions on Information Theory, 56(12):6253-6264, 2010.
  13. Satish Babu Korada and Rüdiger L. Urbanke. Polar codes are optimal for lossy source coding. IEEE Transactions on Information Theory, 56(4):1751-1768, 2010.
  14. Jingbo Liu and Emmanuel Abbe. Polynomial complexity of polar codes for non-binary alphabets, key agreement and Slepian-Wolf coding. In 48th Annual Conference on Information Sciences and Systems, CISS 2014, Princeton, NJ, USA, March 19-21, 2014, pages 1-6, 2014. Available online at http://arxiv.org/abs/1405.0776.
  15. Hessam Mahdavifar and Alexander Vardy. Achieving the secrecy capacity of wiretap channels using polar codes. IEEE Transactions on Information Theory, 57(10):6428-6443, 2011.
  16. Eren Sasoglu. An entropy inequality for q-ary random variables and its application to channel polarization. In ISIT, pages 1360-1363. IEEE, 2010.
  17. Eren Sasoglu. Polarization and polar codes. Foundations and Trends in Communications and Information Theory, 8(4):259-381, 2012.
  18. Eren Sasoglu, Emre Telatar, and Erdal Arıkan. Polarization for arbitrary discrete memoryless channels. CoRR, abs/0908.0302, 2009.
  19. Eren Sasoglu, Emre Telatar, and Edmund M. Yeh. Polar codes for the two-user multiple-access channel. IEEE Transactions on Information Theory, 59(10):6583-6592, 2013.
  20. Terence Tao. Sumset and inverse sumset theory for Shannon entropy. Combinatorics, Probability and Computing, 19(4):603-639, 2010.
  21. Lele Wang and Eren Sasoglu. Polar coding for interference networks. CoRR, abs/1401.7293, 2014.
  22. Hans S. Witsenhausen. Entropy inequalities for discrete channels. IEEE Transactions on Information Theory, 20(5):610-616, 1974.
  23. Aaron D. Wyner and Jacob Ziv. A theorem on the entropy of certain binary sequences and applications-I. IEEE Transactions on Information Theory, 19(6):769-772, 1973.