Lower Bounds on Black-Box Reductions of Hitting to Density Estimation

Author

Roei Tell





Cite As

Roei Tell. Lower Bounds on Black-Box Reductions of Hitting to Density Estimation. In 35th Symposium on Theoretical Aspects of Computer Science (STACS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 96, pp. 58:1-58:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)
https://doi.org/10.4230/LIPIcs.STACS.2018.58

Abstract

Consider a deterministic algorithm that tries to find a string in an unknown set $S \subseteq \{0,1\}^n$, under the promise that $S$ has large density. The only information that the algorithm can obtain about $S$ is estimates of the density of $S$ in adaptively chosen subsets of $\{0,1\}^n$, up to an additive error of $\mu > 0$. This problem is appealing as a derandomization problem, when $S$ is the set of satisfying inputs for a circuit $C : \{0,1\}^n \to \{0,1\}$ that accepts many inputs: in this context, an algorithm as above constitutes a deterministic black-box reduction of the problem of hitting $C$ (i.e., finding a satisfying input for $C$) to the problem of approximately counting the number of satisfying inputs for $C$ on subsets of $\{0,1\}^n$.

We prove tight lower bounds for this problem, demonstrating that naive approaches to solving the problem cannot be improved upon in general. First, we show a tight trade-off between the estimation error $\mu$ and the required number of queries: when $\mu = O(\log(n)/n)$, a polynomial number of queries suffices, and when $\mu \geq 4\log(n)/n$, the required number of queries is $2^{\Theta(\mu \cdot n)}$. Second, we show that the problem "resists" parallelization: any algorithm that works in iterations, and can obtain $p = p(n)$ density estimates "in parallel" in each iteration, still requires $\Omega\left(\frac{n}{\log(p) + \log(1/\mu)}\right)$ iterations to solve the problem.

This work extends the well-known work of Karp, Upfal, and Wigderson (1988), who studied the setting in which $S$ is only guaranteed to be non-empty (rather than dense), and the algorithm can only probe subsets for the existence of a solution in them. In addition, our lower bound on parallel algorithms affirms a weak version of a conjecture of Motwani, Naor, and Naor (1994); we also make progress on a stronger version of their conjecture.
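
To make the query model concrete, below is a minimal Python sketch (an illustration, not code from the paper) of the natural greedy strategy that matches the upper bound above: fix $b = \Theta(\log n)$ bits per iteration, query the estimated density of $S$ in each of the $2^b$ candidate subcubes, and keep the one with the highest estimate. Since the density of the current subcube is the average of the densities of its $2^b$ extensions, the extension with the highest estimate has true density at least the current density minus $2\mu$; after $n/b$ iterations the surviving subcube is a single string, which lies in $S$ as long as $2\mu \cdot (n/b)$ is below the initial density. The function names (find_in_dense_set, density_oracle) and the toy set $S$ are assumptions made for the example.

    import random

    def find_in_dense_set(n, density_oracle, block=4):
        """Find a string in a dense set S given only a noisy density oracle.

        density_oracle(prefix) must return the density of S inside the
        subcube of {0,1}^n whose first len(prefix) bits equal prefix,
        up to an additive error mu; `prefix` is a tuple of bits.
        """
        prefix = ()
        while len(prefix) < n:
            b = min(block, n - len(prefix))
            # Query all 2^b one-block extensions and keep the one whose
            # estimated density is highest.
            prefix = max(
                (prefix + tuple((ext >> i) & 1 for i in range(b))
                 for ext in range(2 ** b)),
                key=density_oracle,
            )
        return prefix

    # Toy usage: S = strings with strictly more ones than zeros (density
    # close to 1/2), and an oracle that counts exactly (for illustration
    # only) and then adds random noise of magnitude at most mu.
    if __name__ == "__main__":
        n, mu = 12, 0.01
        in_S = lambda x: sum(x) > n // 2

        def density_oracle(prefix):
            k = n - len(prefix)
            count = sum(in_S(prefix + tuple((s >> i) & 1 for i in range(k)))
                        for s in range(2 ** k))
            return count / 2 ** k + random.uniform(-mu, mu)

        x = find_in_dense_set(n, density_oracle)
        print("found:", "".join(map(str, x)), "| in S:", in_S(x))

This sketch makes $(n/b) \cdot 2^b$ queries in total, which is polynomial when $b = \Theta(\log n)$; that is exactly the regime $\mu = O(\log(n)/n)$ in which, per the first result above, polynomially many queries suffice, and the lower bounds show this naive strategy cannot be improved upon in general.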
Keywords
  • Approximate Counting
  • Lower Bounds
  • Derandomization
  • Parallel Algorithms
  • Query Complexity


References

  1. Oded Goldreich. In a world of P=BPP. In Studies in Complexity and Cryptography: Miscellanea on the Interplay between Randomness and Computation, pages 191-232. Springer, 2011. URL: http://dx.doi.org/10.1007/978-3-642-22670-0_20.
  2. Oded Goldreich and Avi Wigderson. On derandomizing algorithms that err extremely rarely. In Proc. 46th Annual ACM Symposium on Theory of Computing (STOC), pages 109-118, 2014. URL: http://dx.doi.org/10.1145/2591796.2591808.
  3. Russell Impagliazzo and Avi Wigderson. Randomness vs. time: de-randomization under a uniform assumption. In Proc. 39th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 734-, 1998.
  4. Russell Impagliazzo, Valentine Kabanets, and Avi Wigderson. In search of an easy witness: exponential time vs. probabilistic polynomial time. Journal of Computer and System Sciences, 65(4):672-694, 2002.
  5. Valentine Kabanets and Russell Impagliazzo. Derandomizing polynomial identity tests means proving circuit lower bounds. Computational Complexity, 13(1-2):1-46, 2004.
  6. Richard M. Karp, Eli Upfal, and Avi Wigderson. The complexity of parallel search. Journal of Computer and System Sciences, 36(2):225-253, 1988.
  7. Rajeev Motwani, Joseph Naor, and Moni Naor. The probabilistic method yields deterministic parallel algorithms. Journal of Computer and System Sciences, 49(3):478-516, 1994.
  8. Roei Tell. Lower bounds on black-box reductions of hitting to density estimation. Electronic Colloquium on Computational Complexity (ECCC), 23:50, 2016.
  9. Roei Tell. Improved bounds for quantified derandomization of constant-depth circuits and polynomials. In Proc. 32nd Annual IEEE Conference on Computational Complexity (CCC), pages 18:1-18:49, 2017.
  10. Roei Tell. Quantified derandomization of linear threshold circuits. Electronic Colloquium on Computational Complexity (ECCC), 24:145, 2017.
  11. Luca Trevisan and Salil P. Vadhan. Pseudorandomness and average-case complexity via uniform reductions. Computational Complexity, 16(4):331-364, 2007.
  12. Salil P. Vadhan. Pseudorandomness. Foundations and Trends in Theoretical Computer Science. Now Publishers, 2012.
  13. Emanuele Viola. The complexity of constructing pseudorandom generators from hard functions. Computational Complexity, 13(3-4):147-188, 2005.
  14. Ryan Williams. Improving exhaustive search implies superpolynomial lower bounds. SIAM Journal on Computing, 42(3):1218-1244, 2013.