Abstract
The hardness vs. randomness paradigm aims to explicitly construct pseudorandom generators G:{0,1}^r → {0,1}^m that fool circuits of size m, assuming the existence of explicit hard functions. A "high-end PRG" with seed length r = O(log m) (implying BPP=P) was achieved in a seminal work of Impagliazzo and Wigderson (STOC 1997), assuming the high-end hardness assumption: there exist constants 0 < β < 1 < B, and functions computable in time 2^{B ⋅ n} that cannot be computed by circuits of size 2^{β ⋅ n}.
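For concreteness, the standard notions underlying these statements can be written out as follows (a sketch in conventional notation; the error bound 1/m is the usual choice and is not quoted from the paper):

```latex
% G : \{0,1\}^r \to \{0,1\}^m fools circuits of size m if, for every
% circuit C of size m,
\left| \Pr_{s \in \{0,1\}^r}\![C(G(s)) = 1] \;-\; \Pr_{x \in \{0,1\}^m}\![C(x) = 1] \right| \;\le\; \frac{1}{m}.
% High-end hardness assumption: there exist constants 0 < \beta < 1 < B and a
% function f, computable in time 2^{B \cdot n}, that no circuit of size
% 2^{\beta \cdot n} computes.
```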
Recently, motivated by fast derandomization of randomized algorithms, Doron et al. (FOCS 2020) and Chen and Tell (STOC 2021) constructed "extreme high-end PRGs" with seed length r = (1+o(1)) ⋅ log m, under qualitatively stronger assumptions.
We study whether extreme high-end PRGs can be constructed from the corresponding hardness assumption in which β = 1 − o(1) and B = 1 + o(1), which we call the extreme high-end hardness assumption. We give a partial negative answer:
- The construction of Doron et al. composes a PEG (pseudo-entropy generator) with an extractor. The PEG is constructed starting from a function that is hard for MA-type circuits. We show that black-box PEG constructions from the extreme high-end hardness assumption must have large seed length (and so cannot be used to obtain extreme high-end PRGs by applying an extractor).
  To prove this, we establish a new property of (general) black-box PRG constructions from hard functions: it is possible to fix many output bits of the construction while fixing few bits of the hard function. This property distinguishes PRG constructions from typical extractor constructions, and may explain why it is difficult to design PRG constructions.
- The construction of Chen and Tell composes two PRGs: G₁:{0,1}^{(1+o(1)) ⋅ log m} → {0,1}^{r₂ = m^{Ω(1)}} and G₂:{0,1}^{r₂} → {0,1}^m. The first PRG is constructed from the extreme high-end hardness assumption; the second PRG needs to run in time m^{1+o(1)} and is constructed assuming one-way functions. We show that in black-box proofs of hardness amplification to 1/2 + 1/m, reductions must make Ω(m) queries, even in the extreme high-end. Known PRG constructions from hard functions are black-box and use (or imply) hardness amplification, and so cannot be used to construct a PRG G₂ from the extreme high-end hardness assumption.
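The seed-length arithmetic behind this two-step composition can be summarized as follows (only restating the parameters given above, in standard notation):

```latex
% The two generators being composed:
G_1 : \{0,1\}^{(1+o(1))\log m} \to \{0,1\}^{r_2}, \qquad r_2 = m^{\Omega(1)},
\qquad G_2 : \{0,1\}^{r_2} \to \{0,1\}^{m}.
% Their composition inherits the seed length of G_1:
G_2 \circ G_1 : \{0,1\}^{(1+o(1))\log m} \to \{0,1\}^{m},
% i.e., an extreme high-end PRG, provided both components can be built
% from the stated assumptions.
```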
The new feature of our hardness amplification result is that it applies even in the extreme high-end setting of parameters, whereas past work does not. Our techniques also improve recent lower bounds of Ron-Zewi, Shaltiel and Varma (ITCS 2021) on the number of queries of local list-decoding algorithms.
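For reference, the shape of a black-box hardness amplification proof in this setting is the following (standard formulation; the notation Amp and R is mine, not taken from the paper):

```latex
% An amplification map f \mapsto \mathrm{Amp}(f) has a black-box proof if
% there is an oracle reduction R such that, for every adversary A with
\Pr_{x}\!\left[A(x) = \mathrm{Amp}(f)(x)\right] \;\ge\; \frac{1}{2} + \frac{1}{m},
% the reduction R^{A} computes f on every input. The lower bound above
% states that any such R must make \Omega(m) oracle queries to A,
% even for extreme high-end parameters.
```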
BibTeX Entry
@InProceedings{shaltiel_et_al:LIPIcs.ITCS.2022.116,
author = {Shaltiel, Ronen and Viola, Emanuele},
title = {{On Hardness Assumptions Needed for "Extreme High-End" PRGs and Fast Derandomization}},
booktitle = {13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
pages = {116:1--116:17},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-217-4},
ISSN = {1868-8969},
year = {2022},
volume = {215},
editor = {Braverman, Mark},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/opus/volltexte/2022/15712},
URN = {urn:nbn:de:0030-drops-157122},
doi = {10.4230/LIPIcs.ITCS.2022.116},
annote = {Keywords: Complexity Theory, Derandomization, Pseudorandom generators, Black-box proofs}
}
Keywords: Complexity Theory, Derandomization, Pseudorandom generators, Black-box proofs
Collection: 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)
Issue Date: 2022
Date of publication: 25.01.2022