2 Search Results for "Urbanke, Rüdiger"


Document
Coding Theory for Inference, Learning and Optimization (Dagstuhl Seminar 18112)

Authors: Po-Ling Loh, Arya Mazumdar, Dimitris Papailiopoulos, and Rüdiger Urbanke

Published in: Dagstuhl Reports, Volume 8, Issue 3 (2018)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 18112, "Coding Theory for Inference, Learning and Optimization." Coding theory has recently found new applications in areas such as distributed machine learning, dimension reduction, and a variety of statistical problems involving estimation and inference. In machine learning applications that use large-scale data, it is desirable to communicate the results of distributed computations in an efficient and robust manner. In dimension reduction applications, the pseudorandom properties of algebraic codes may be used to construct projection matrices that are deterministic and facilitate algorithmic efficiency. Finally, relationships that have been forged between coding theory and problems in theoretical computer science, such as k-SAT or the planted clique problem, lead to a new interpretation of the sharp thresholds encountered in these settings in terms of thresholds in channel coding theory. The aim of this Dagstuhl Seminar was to draw together researchers from industry and academia who are working in coding theory, particularly in these different (and somewhat disparate) application areas of machine learning and inference. The discussions and collaborations facilitated by this seminar were intended to spark new ideas about how coding theory may be used to improve and inform modern techniques for data analytics.

Cite as

Po-Ling Loh, Arya Mazumdar, Dimitris Papailiopoulos, and Rüdiger Urbanke. Coding Theory for Inference, Learning and Optimization (Dagstuhl Seminar 18112). In Dagstuhl Reports, Volume 8, Issue 3, pp. 60-73, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@Article{loh_et_al:DagRep.8.3.60,
  author =	{Loh, Po-Ling and Mazumdar, Arya and Papailiopoulos, Dimitris and Urbanke, R\"{u}diger},
  title =	{{Coding Theory for Inference, Learning and Optimization (Dagstuhl Seminar 18112)}},
  pages =	{60--73},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2018},
  volume =	{8},
  number =	{3},
  editor =	{Loh, Po-Ling and Mazumdar, Arya and Papailiopoulos, Dimitris and Urbanke, R\"{u}diger},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.8.3.60},
  URN =		{urn:nbn:de:0030-drops-92977},
  doi =		{10.4230/DagRep.8.3.60},
  annote =	{Keywords: Coding theory, Distributed optimization, Machine learning, Threshold phenomena}
}
Document
An Entropy Sumset Inequality and Polynomially Fast Convergence to Shannon Capacity Over All Alphabets

Authors: Venkatesan Guruswami and Ameya Velingker

Published in: LIPIcs, Volume 33, 30th Conference on Computational Complexity (CCC 2015)


Abstract
We prove a lower estimate on the increase in entropy when two copies of a conditional random variable X | Y, with X supported on Z_q = {0, 1, ..., q-1} for prime q, are summed modulo q. Specifically, given two i.i.d. copies (X_1, Y_1) and (X_2, Y_2) of a pair of random variables (X, Y), with X taking values in Z_q, we show H(X_1 + X_2 | Y_1, Y_2) - H(X | Y) >= alpha(q) * H(X | Y) * (1 - H(X | Y)) for some alpha(q) > 0, where H(.) is the entropy normalized by the factor log_2(q). In particular, if X | Y is not close to being fully random or fully deterministic and H(X | Y) is in (gamma, 1 - gamma), then the entropy of the sum increases by Omega_q(gamma). Our motivation is an effective analysis of the finite-length behavior of polar codes, for which the linear dependence on gamma is quantitatively important. The assumption of q being prime is necessary: for X supported uniformly on a proper subgroup of Z_q we have H(X + X) = H(X). For X supported on infinite groups without a finite subgroup (the torsion-free case) and no conditioning, a sumset inequality for the absolute increase in (unnormalized) entropy was shown by Tao in [Tao, CP&R 2010]. We use our sumset inequality to analyze Arıkan's construction of polar codes and prove that for any q-ary source X, where q is any fixed prime, and any epsilon > 0, polar codes allow efficient data compression of N i.i.d. copies of X into (H(X) + epsilon)N q-ary symbols, as soon as N is polynomially large in 1/epsilon. We can get capacity-achieving source codes with similar guarantees for composite alphabets by factoring q into primes and combining different polar codes for each prime in the factorization. A consequence of our result for noisy channel coding is that for all discrete memoryless channels, there are explicit codes enabling reliable communication within epsilon > 0 of the symmetric Shannon capacity for a block length and decoding complexity bounded by a polynomial in 1/epsilon.
The result was previously shown for the special case of binary-input channels [Guruswami/Xia, FOCS'13; Hassani/Alishahi/Urbanke, CoRR 2013], and this work extends the result to channels over any alphabet.
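The abstract's sumset inequality can be checked numerically in the unconditional case (trivial Y): for X on Z_q with prime q, the normalized entropy of X_1 + X_2 (mod q) strictly exceeds that of X unless X is uniform or deterministic. A minimal sketch (the distribution p below is an arbitrary illustrative choice, not taken from the paper):

```python
import math

def normalized_entropy(p, q):
    """Entropy of distribution p on Z_q, normalized by log_2(q)."""
    h = -sum(x * math.log2(x) for x in p if x > 0)
    return h / math.log2(q)

def sum_distribution(p, q):
    """Distribution of X_1 + X_2 mod q for i.i.d. X_1, X_2 ~ p (cyclic convolution)."""
    s = [0.0] * q
    for i in range(q):
        for j in range(q):
            s[(i + j) % q] += p[i] * p[j]
    return s

q = 5                              # a prime alphabet size
p = [0.6, 0.1, 0.1, 0.1, 0.1]      # a biased (non-uniform, non-deterministic) distribution
h = normalized_entropy(p, q)
h_sum = normalized_entropy(sum_distribution(p, q), q)
print(h, h_sum)                    # the sum has strictly larger normalized entropy
```

For a composite q the increase can vanish: the abstract's counterexample takes X uniform on a proper subgroup of Z_q, where X + X has the same distribution and hence the same entropy.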

Cite as

Venkatesan Guruswami and Ameya Velingker. An Entropy Sumset Inequality and Polynomially Fast Convergence to Shannon Capacity Over All Alphabets. In 30th Conference on Computational Complexity (CCC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 33, pp. 42-57, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{guruswami_et_al:LIPIcs.CCC.2015.42,
  author =	{Guruswami, Venkatesan and Velingker, Ameya},
  title =	{{An Entropy Sumset Inequality and Polynomially Fast Convergence to Shannon Capacity Over All Alphabets}},
  booktitle =	{30th Conference on Computational Complexity (CCC 2015)},
  pages =	{42--57},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-81-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{33},
  editor =	{Zuckerman, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2015.42},
  URN =		{urn:nbn:de:0030-drops-50755},
  doi =		{10.4230/LIPIcs.CCC.2015.42},
  annote =	{Keywords: Polar codes, polynomial gap to capacity, entropy sumset inequality, arbitrary alphabets}
}
