This paper presents the following results on sets that are complete for $\NP$.
\begin{enumerate}
  \item If there is a problem in $\NP$ that requires $\twonO$ time at almost all lengths, then every many-one NP-complete set is complete under length-increasing reductions that are computed by polynomial-size circuits.
  \item If there is a problem in $\CoNP$ that cannot be solved by polynomial-size nondeterministic circuits, then every many-one complete set for $\NP$ is complete under length-increasing reductions that are computed by polynomial-size circuits.
  \item If there exists a one-way permutation that is secure against subexponential-size circuits and there is a hard tally language in $\NP \cap \CoNP$, then there is a Turing complete language for $\NP$ that is not many-one complete.
\end{enumerate}
Our first two results use worst-case hardness hypotheses, whereas earlier work that showed similar results relied on average-case or almost-everywhere hardness assumptions. The combination of an average-case and a worst-case hypothesis in the last result is unique: previous results obtaining the same consequence relied on almost-everywhere hardness assumptions.
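For context, the reducibility notions compared above can be summarized as follows; these are the standard textbook definitions, stated here only as a reference sketch, and the notation follows common usage rather than the paper's own macros.
\begin{itemize}
  \item \emph{Many-one reduction:} $A \le^{p}_{m} B$ if there is a polynomial-time computable function $f$ such that $x \in A \iff f(x) \in B$ for all $x$. A set is many-one complete for $\NP$ if it is in $\NP$ and every set in $\NP$ many-one reduces to it.
  \item \emph{Length-increasing reduction:} a many-one reduction $f$ that additionally satisfies $|f(x)| > |x|$ for every $x$.
  \item \emph{Reductions computed by polynomial-size circuits:} the reduction $f$ is computed by a (possibly nonuniform) family of polynomial-size circuits, i.e.\ $f$ is $\mathrm{P/poly}$-computable rather than uniformly polynomial-time computable.
  \item \emph{Turing reduction:} $A \le^{p}_{T} B$ if some polynomial-time oracle machine decides $A$ when given $B$ as an oracle; unlike a many-one reduction, it may make many adaptive queries to $B$.
\end{itemize}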
@InProceedings{gu_et_al:LIPIcs.STACS.2010.2462,
  author    = {Gu, Xiaoyang and Hitchcock, John M. and Pavan, Aduri},
  title     = {{Collapsing and Separating Completeness Notions under Average-Case and Worst-Case Hypotheses}},
  booktitle = {27th International Symposium on Theoretical Aspects of Computer Science},
  pages     = {429--440},
  series    = {Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN      = {978-3-939897-16-3},
  ISSN      = {1868-8969},
  year      = {2010},
  volume    = {5},
  editor    = {Marion, Jean-Yves and Schwentick, Thomas},
  publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address   = {Dagstuhl, Germany},
  URL       = {https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2010.2462},
  URN       = {urn:nbn:de:0030-drops-24627},
  doi       = {10.4230/LIPIcs.STACS.2010.2462},
  annote    = {Keywords: Computational complexity, NP-completeness}
}