We prove a lower bound on the amount of nonuniform advice needed by black-box reductions for the Dense Model Theorem of Green, Tao, and Ziegler, and of Reingold, Trevisan, Tulsiani, and Vadhan. The latter theorem roughly says that for every distribution D that is δ-dense in a distribution that is ε′-indistinguishable from uniform, there exists a "dense model" for D, that is, a distribution that is δ-dense in the uniform distribution and is ε-indistinguishable from D. This ε-indistinguishability is with respect to an arbitrary small class of functions F. For the natural case where ε′ ≥ Ω(εδ) and ε ≥ δ^{O(1)}, our lower bound implies that Ω(√((1/ε) log(1/δ)) · log|F|) advice bits are necessary. There is only a polynomial gap between our lower bound and the best upper bound for this case (due to Zhang), which is O((1/ε²) log(1/δ) · log|F|). Our lower bound can be viewed as an analog of list size lower bounds for list-decoding of error-correcting codes, but for "dense model decoding" instead. Our proof introduces some new techniques which may be of independent interest, including an analysis of a majority of majorities of p-biased bits. The latter analysis uses an extremely tight lower bound on the tail of the binomial distribution, which we could not find in the literature.
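To make the "polynomial gap" between the two bounds concrete, here is a minimal sketch (not from the paper) that evaluates both advice bounds with all hidden constants dropped. The function names and the sample values of ε, δ, and |F| are hypothetical, and logarithms are fixed to natural log (base 2 for log|F|) purely for illustration; the asymptotic statements are insensitive to these choices.

```python
import math

def advice_lower_bound(eps: float, delta: float, f_size: int) -> float:
    """Omega(sqrt((1/eps) * log(1/delta)) * log|F|), constants dropped."""
    return math.sqrt((1 / eps) * math.log(1 / delta)) * math.log2(f_size)

def advice_upper_bound(eps: float, delta: float, f_size: int) -> float:
    """O((1/eps^2) * log(1/delta) * log|F|) (Zhang), constants dropped."""
    return (1 / eps ** 2) * math.log(1 / delta) * math.log2(f_size)

# Hypothetical parameters chosen so that eps >= delta^{O(1)} holds.
eps, delta, f_size = 0.01, 0.001, 2 ** 20

lo = advice_lower_bound(eps, delta, f_size)
hi = advice_upper_bound(eps, delta, f_size)
# The ratio grows like (1/eps)^{3/2} * sqrt(log(1/delta)):
# polynomial in 1/eps, hence a "polynomial gap".
print(f"lower ~ {lo:.0f} bits, upper ~ {hi:.0f} bits, ratio ~ {hi / lo:.0f}")
```

For these parameters the sketch reports a lower bound of roughly 5 × 10² bits against an upper bound of roughly 1.4 × 10⁶ bits, a ratio of about (1/ε)^{3/2}·√(log(1/δ)), which is polynomial in 1/ε as the abstract claims.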