Abstract
Suppose Alice holds a uniformly random string X in {0,1}^N and Bob holds a noisy version Y of X, where each bit of X is flipped independently with probability epsilon in [0,1/2]. Alice and Bob would like to extract a common random string of min-entropy at least k. In this work, we establish the communication versus success probability trade-off for this problem by giving a protocol and a matching lower bound (under the restriction that the string to be agreed upon is determined by Alice's input X). Specifically, we prove that in order for Alice and Bob to agree on a common string with probability 2^{-gamma k} (gamma k >= 1), the optimal communication (up to o(k) terms, and achievable for large N) is precisely (C * (1-gamma) - 2 * sqrt{C * (1-C) * gamma}) * k, where C := 4 * epsilon * (1-epsilon). In particular, the optimal communication to achieve Omega(1) agreement probability approaches 4 * epsilon * (1-epsilon) * k.
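The stated rate can be checked numerically. The sketch below (function name is illustrative, not from the paper) evaluates the abstract's formula for the bit-flip model; as gamma tends to 0, i.e. constant agreement probability, it recovers the claimed limit C * k = 4 * epsilon * (1-epsilon) * k.

```python
import math

def bsc_communication(eps, gamma):
    """Optimal communication per unit of min-entropy k (up to o(k) terms)
    for agreement probability 2^{-gamma*k} in the bit-flip model, as stated
    in the abstract: C*(1-gamma) - 2*sqrt(C*(1-C)*gamma), with C = 4*eps*(1-eps).
    Illustrative helper, not code from the paper."""
    C = 4 * eps * (1 - eps)
    return C * (1 - gamma) - 2 * math.sqrt(C * (1 - C) * gamma)
```

For example, at eps = 0.25 and gamma = 0 the rate is C = 4 * 0.25 * 0.75 = 0.75 bits of communication per bit of min-entropy, and the rate decreases as one settles for a smaller agreement probability (larger gamma).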
We also consider the case when Y is the output of the binary erasure channel on X, where each bit of Y equals the corresponding bit of X with probability 1-epsilon and is otherwise erased (that is, replaced by a "?"). In this case, the communication required becomes (epsilon * (1-gamma) - 2 * sqrt{epsilon * (1-epsilon) * gamma}) * k. In particular, the optimal communication to achieve Omega(1) agreement probability approaches epsilon * k, and with no communication the optimal agreement probability approaches 2^{-(1-sqrt{1-epsilon})/(1+sqrt{1-epsilon}) * k}.
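The two erasure-channel claims are consistent with each other: the zero-communication exponent (1-sqrt{1-epsilon})/(1+sqrt{1-epsilon}) is exactly the value of gamma at which the communication formula vanishes. The sketch below (function names are illustrative, not from the paper) verifies this numerically.

```python
import math

def bec_communication(eps, gamma):
    # Abstract's communication rate per unit of min-entropy k for the
    # binary erasure channel: eps*(1-gamma) - 2*sqrt(eps*(1-eps)*gamma).
    return eps * (1 - gamma) - 2 * math.sqrt(eps * (1 - eps) * gamma)

def zero_comm_exponent(eps):
    # Value of gamma at which the communication drops to zero; per the
    # abstract, the zero-communication agreement probability is 2^{-gamma*k}
    # with gamma = (1 - sqrt(1-eps)) / (1 + sqrt(1-eps)).
    return (1 - math.sqrt(1 - eps)) / (1 + math.sqrt(1 - eps))
```

At gamma = 0 the rate is epsilon * k, matching the Omega(1)-agreement claim, and plugging gamma = zero_comm_exponent(eps) into bec_communication yields zero (up to floating-point error).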
Our protocols are based on covering codes and extend the approach of (Bogdanov and Mossel, 2011) for the zero-communication case. Our lower bounds rely on hypercontractive inequalities. For the model of bit-flips, our argument extends the approach of (Bogdanov and Mossel, 2011) by allowing communication; for the erasure model, to the best of our knowledge the needed hypercontractivity statement was not studied before, and it was established, motivated by our application, by (Nair and Wang, 2015). We also obtain information complexity lower bounds for these tasks, and together with our protocol, they shed light on the recently popular "most informative Boolean function" conjecture of Courtade and Kumar.
BibTeX Entry
@InProceedings{guruswami_et_al:LIPIcs:2016:5845,
author = {Venkatesan Guruswami and Jaikumar Radhakrishnan},
title = {{Tight Bounds for Communication-Assisted Agreement Distillation}},
booktitle = {31st Conference on Computational Complexity (CCC 2016)},
pages = {6:1--6:17},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-008-8},
ISSN = {1868-8969},
year = {2016},
volume = {50},
editor = {Ran Raz},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {http://drops.dagstuhl.de/opus/volltexte/2016/5845},
URN = {urn:nbn:de:0030-drops-58450},
doi = {10.4230/LIPIcs.CCC.2016.6},
annote = {Keywords: communication complexity, covering codes, hypercontractivity, information theory, lower bounds, pseudorandomness}
}
Date of publication: 18.05.2016