Testing and Learning Convex Sets in the Ternary Hypercube

Authors: Hadley Black, Eric Blais, Nathaniel Harms



Author Details

Hadley Black
  • University of California, Los Angeles, CA, USA
Eric Blais
  • University of Waterloo, Canada
Nathaniel Harms
  • EPFL, Lausanne, Switzerland

Cite As

Hadley Black, Eric Blais, and Nathaniel Harms. Testing and Learning Convex Sets in the Ternary Hypercube. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 15:1-15:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


We study the problems of testing and learning high-dimensional discrete convex sets. The simplest high-dimensional discrete domain where convexity is a non-trivial property is the ternary hypercube, {-1,0,1}ⁿ. The goal of this work is to understand the structural combinatorial properties of convex sets in this domain and to determine the complexity of the testing and learning problems. We obtain the following results.
  • Structural: We prove nearly tight bounds on the edge boundary of convex sets in {-1,0,1}ⁿ, showing that the maximum edge boundary of a convex set is Õ(n^{3/4})⋅3ⁿ; equivalently, every convex set has influence Õ(n^{3/4}), and there exists a convex set with influence Ω(n^{3/4}).
  • Learning and sample-based testing: We prove upper and lower bounds of 3^{Õ(n^{3/4})} and 3^{Ω(√n)} on the number of random examples needed to learn convex sets under the uniform distribution. The analysis of the learning algorithm relies on our upper bound on the influence. Both the upper and lower bounds also hold for sample-based testing with two-sided error. For sample-based testing with one-sided error, we show that the sample complexity is 3^{Θ(n)}.
  • Testing with queries: We prove nearly matching upper and lower bounds of 3^{Θ̃(√n)} for one-sided error testing of convex sets with non-adaptive queries.
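To make the edge-boundary notion concrete, here is a small illustrative script (not from the paper) that counts boundary edges of a set S ⊆ {-1,0,1}ⁿ by brute force for small n. Edges of the ternary hypercube are taken to join points that differ by exactly 1 in a single coordinate (so -1—0 and 0—1, but not -1—1), and the influence of S is its edge boundary normalized by 3ⁿ. The function name and the example ball are this sketch's own choices, not notation from the paper.

```python
from itertools import product

def edge_boundary(S, n):
    """Count edges of the ternary hypercube {-1,0,1}^n with exactly one
    endpoint in S. An edge joins two points differing by 1 in one coordinate;
    each edge is enumerated once, from its lower endpoint."""
    count = 0
    for x in product((-1, 0, 1), repeat=n):
        for i in range(n):
            if x[i] < 1:  # neighbor obtained by increasing coordinate i by 1
                y = x[:i] + (x[i] + 1,) + x[i + 1:]
                if (x in S) != (y in S):
                    count += 1
    return count

n = 4
# Example of a convex set: grid points of a centered Euclidean ball.
ball = {x for x in product((-1, 0, 1), repeat=n) if sum(v * v for v in x) <= 2}
b = edge_boundary(ball, n)
print(n, len(ball), b, b / 3**n)  # influence = (edge boundary) / 3^n
```

For instance, the singleton {0ⁿ} has edge boundary exactly 2n (one edge per signed coordinate direction), while the empty set and the full cube have boundary 0; the paper's structural result says that for convex S the normalized boundary never exceeds Õ(n^{3/4}).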

Subject Classification

ACM Subject Classification
  • Theory of computation → Streaming, sublinear and near linear time algorithms
  • Theory of computation → Randomness, geometry and discrete structures
  • Theory of computation → Computational geometry

Keywords
  • Property testing
  • learning theory
  • convex sets
  • testing convexity
  • fluctuation



