LIPIcs.APPROX-RANDOM.2023.36.pdf
A circuit 𝒞 samples a distribution X with error ε if the statistical distance between the output of 𝒞 on a uniformly random input and X is ε. We study the hardness of sampling the uniform distribution over the set of n-bit strings of Hamming weight k, denoted Uⁿ_k, for decision forests, i.e. circuits in which every output bit is computed by a decision tree of the input bits. For every k there is an O(log n)-depth decision forest sampling Uⁿ_k with inverse-polynomial error [Emanuele Viola, 2012; Czumaj, 2015]. We show that for every ε > 0 there exists τ such that for decision depth τ log(n/k) / log log(n/k), the error for sampling Uⁿ_k is at least 1 - ε. Our result is based on the recent robust sunflower lemma [Ryan Alweiss et al., 2021; Rao, 2019].

Our second result is about matching a set of n-bit strings with the image of a d-local circuit, i.e. a circuit in which each output bit depends on at most d input bits. We study the set of all n-bit strings whose Hamming weight is at least n/2. We improve the previously known locality lower bound from Ω(log* n) [Beyersdorff et al., 2013] to Ω(√log n), leaving only a quartic gap from the best upper bound of O(log² n).
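To make the sampling-error definition concrete, here is a minimal brute-force sketch (not from the paper): it enumerates all inputs of a tiny circuit, computes its exact output distribution, and measures the statistical (total variation) distance to Uⁿ_k. The helper names `statistical_distance`, `uniform_weight_k`, and the trivial 1-local `copy_sampler` are illustrative assumptions, and the enumeration is only feasible for very small n.

```python
from collections import Counter
from itertools import product


def statistical_distance(p, q):
    # Total variation distance: (1/2) * sum_x |p(x) - q(x)| over the union of supports.
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)


def uniform_weight_k(n, k):
    # U^n_k: the uniform distribution over n-bit strings of Hamming weight exactly k.
    strings = [s for s in product((0, 1), repeat=n) if sum(s) == k]
    return {s: 1.0 / len(strings) for s in strings}


def output_distribution(circuit, m):
    # Exact output distribution of a circuit fed m uniformly random input bits,
    # obtained by enumerating all 2^m inputs (feasible only for tiny m).
    counts = Counter(circuit(bits) for bits in product((0, 1), repeat=m))
    return {s: c / 2 ** m for s, c in counts.items()}


def copy_sampler(bits):
    # A trivially 1-local circuit: each output bit reads a single input bit,
    # so its output is uniform over all n-bit strings, not over weight-k strings.
    return tuple(bits)


if __name__ == "__main__":
    n, k = 4, 2
    target = uniform_weight_k(n, k)
    sampled = output_distribution(copy_sampler, n)
    # The copy sampler lands in the weight-2 slice with probability C(4,2)/2^4 = 6/16,
    # so its error for sampling U^4_2 is 1 - 6/16 = 0.625.
    print(statistical_distance(target, sampled))
```

The lower bound in the paper says that, once the decision depth drops below τ log(n/k) / log log(n/k), every decision forest is essentially as bad as this trivial sampler: its error is at least 1 - ε.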