LIPIcs.APPROX-RANDOM.2024.37.pdf
We study monotonicity testing of functions f : {0,1}^d → {0,1} using sample-based algorithms, which may only observe the value of f on points drawn independently from the uniform distribution. A classic result of Bshouty and Tamon (J. ACM 1996) shows that monotone functions can be learned with exp(Õ(min{(1/ε)√d, d})) samples, and it is not hard to show that this bound extends to testing. Prior to our work, the only lower bound for this problem was Ω(√(exp(d)/ε)), due to Goldreich, Goldwasser, Lehman, Ron, and Samorodnitsky (Combinatorica 2000), and it applies only in the small-ε regime ε = O(d^{-3/2}). Thus, the sample complexity of monotonicity testing was wide open for ε ≫ d^{-3/2}.

We resolve this question, obtaining a nearly tight lower bound of exp(Ω(min{(1/ε)√d, d})) for all ε at most a sufficiently small constant. In fact, we prove a much more general result, showing that the sample complexity of k-monotonicity testing and learning for functions f : {0,1}^d → [r] is exp(Ω(min{(rk/ε)√d, d})). For testing with one-sided error, we show that the sample complexity is exp(Ω(d)).

Beyond the hypercube, we prove nearly tight bounds of exp(Θ̃(min{(rk/ε)√d, d})), up to polylog factors of d, k, r, 1/ε in the exponent, on the sample complexity of testing and learning measurable k-monotone functions f : ℝ^d → [r] under product distributions. Our upper bound improves upon the previous bound of exp(Õ(min{(k/ε²)√d, d})) due to Harms and Yoshida (ICALP 2022) for Boolean functions (r = 2).
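To make the sample-based model concrete, here is a minimal Python sketch (not the paper's algorithm) of a one-sided tester for monotonicity over {0,1}^d: it observes f only on uniformly random points and rejects only when two sampled points form a violation, i.e. x ⪯ y coordinatewise but f(x) > f(y). Such a pair certifies non-monotonicity, so a monotone function is never rejected; the lower bounds above say that any tester of this kind needs a number of samples exponential in min{(1/ε)√d, d} in general. The function name and the parameter num_samples are illustrative choices, not from the paper.

```python
import itertools
import random

def dominates(y, x):
    """Return True if x <= y coordinatewise (x, y are 0/1 tuples)."""
    return all(xi <= yi for xi, yi in zip(x, y))

def sample_based_monotonicity_tester(f, d, num_samples, rng=random):
    """One-sided, sample-based tester: observes f only on uniform samples.

    Rejects (returns False) only when two sampled points witness a
    violation x <= y with f(x) > f(y), so a monotone f is always
    accepted. How large num_samples must be to catch functions that
    are eps-far from monotone is exactly what the sample-complexity
    bounds in the abstract quantify.
    """
    samples = [tuple(rng.randint(0, 1) for _ in range(d))
               for _ in range(num_samples)]
    values = {x: f(x) for x in samples}  # deduplicate repeated samples
    for x, y in itertools.combinations(values, 2):
        if dominates(y, x) and values[x] > values[y]:
            return False  # certified violation: f is not monotone
        if dominates(x, y) and values[y] > values[x]:
            return False
    return True  # no violation found among the samples

# Example: majority is monotone, so it is always accepted.
d = 10
maj = lambda x: int(sum(x) > d / 2)
print(sample_based_monotonicity_tester(maj, d, num_samples=200))
```

Because rejection requires an explicit violating pair among the samples, this sketch has one-sided error by construction, matching the regime in which the abstract's exp(Ω(d)) lower bound applies.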