Superpolynomial Lower Bounds for Learning Monotone Classes

Author: Nader H. Bshouty



File

LIPIcs.APPROX-RANDOM.2023.34.pdf
  • Filesize: 0.78 MB
  • 20 pages

Author Details

Nader H. Bshouty
  • Department of Computer Science, Technion, Haifa, Israel

Acknowledgements

I would like to express my sincere gratitude to the reviewers of RANDOM for their valuable comments on this research. Their feedback was greatly appreciated and helped improve the quality of this work.

Cite As

Nader H. Bshouty. Superpolynomial Lower Bounds for Learning Monotone Classes. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 34:1-34:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)
https://doi.org/10.4230/LIPIcs.APPROX/RANDOM.2023.34
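
For convenience, a BibTeX entry can be assembled from the citation above; a minimal sketch follows, where the entry key and field formatting are illustrative rather than the official Dagstuhl export:

% Illustrative BibTeX entry; all field values are taken from the Cite As text above.
@InProceedings{Bshouty2023MonotoneLowerBounds,
  author    = {Nader H. Bshouty},
  title     = {Superpolynomial Lower Bounds for Learning Monotone Classes},
  booktitle = {Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  series    = {Leibniz International Proceedings in Informatics (LIPIcs)},
  volume    = {275},
  pages     = {34:1--34:20},
  publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  year      = {2023},
  doi       = {10.4230/LIPIcs.APPROX/RANDOM.2023.34}
}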

Abstract

Koch, Strassle, and Tan [SODA 2023] show that, under the randomized exponential time hypothesis, there is no distribution-free PAC-learning algorithm that runs in time n^Õ(log log s) for the classes of n-variable size-s DNF, size-s Decision Tree, and log s-Junta by DNF (i.e., learning algorithms that return a DNF hypothesis). Assuming a natural conjecture on the hardness of set cover, they give the lower bound n^Ω(log s). This matches the best known upper bound for n-variable size-s Decision Tree and log s-Junta. In this paper, we give the same lower bounds for PAC-learning of n-variable size-s Monotone DNF, size-s Monotone Decision Tree, and Monotone log s-Junta by DNF. This solves the open problem proposed by Koch, Strassle, and Tan and subsumes the above results. The lower bound holds even if the learner knows the distribution, can draw a sample according to the distribution in polynomial time, and can compute the target function on all the points of the support of the distribution in polynomial time.

Subject Classification

ACM Subject Classification
  • Theory of computation
Keywords
  • PAC Learning
  • Monotone DNF
  • Monotone Decision Tree
  • Monotone Junta
  • Lower Bound

References

  1. Michael Alekhnovich, Mark Braverman, Vitaly Feldman, Adam R. Klivans, and Toniann Pitassi. The complexity of properly learning simple concept classes. J. Comput. Syst. Sci., 74(1):16-34, 2008. URL: https://doi.org/10.1016/j.jcss.2007.04.011.
  2. Dana Angluin. Queries and concept learning. Machine Learning, 2(4):319-342, 1987.
  3. Chris Calabro, Russell Impagliazzo, Valentine Kabanets, and Ramamohan Paturi. The complexity of unique k-SAT: An isolation lemma for k-CNFs. J. Comput. Syst. Sci., 74(3):386-393, 2008. URL: https://doi.org/10.1016/j.jcss.2007.06.015.
  4. Holger Dell, Thore Husfeldt, Dániel Marx, Nina Taslaman, and Martin Wahlen. Exponential time complexity of the permanent and the Tutte polynomial. ACM Trans. Algorithms, 10(4):21:1-21:32, 2014. URL: https://doi.org/10.1145/2635812.
  5. Andrzej Ehrenfeucht and David Haussler. Learning decision trees from random examples. Inf. Comput., 82(3):231-246, 1989. URL: https://doi.org/10.1016/0890-5401(89)90001-1.
  6. Thomas R. Hancock, Tao Jiang, Ming Li, and John Tromp. Lower bounds on learning decision lists and trees. Inf. Comput., 126(2):114-122, 1996. URL: https://doi.org/10.1006/inco.1996.0040.
  7. Lisa Hellerstein, Devorah Kletenik, Linda Sellie, and Rocco A. Servedio. Tight bounds on proper equivalence query learning of DNF. In COLT 2012 - The 25th Annual Conference on Learning Theory, June 25-27, 2012, Edinburgh, Scotland, pages 31.1-31.18, 2012. URL: http://proceedings.mlr.press/v23/hellerstein12/hellerstein12.pdf.
  8. Russell Impagliazzo and Ramamohan Paturi. On the complexity of k-SAT. J. Comput. Syst. Sci., 62(2):367-375, 2001. URL: https://doi.org/10.1006/jcss.2000.1727.
  9. Russell Impagliazzo, Ramamohan Paturi, and Francis Zane. Which problems have strongly exponential complexity? J. Comput. Syst. Sci., 63(4):512-530, 2001. URL: https://doi.org/10.1006/jcss.2001.1774.
  10. Caleb Koch, Carmen Strassle, and Li-Yang Tan. Superpolynomial lower bounds for decision tree learning and testing. CoRR, abs/2210.06375, 2022. URL: https://doi.org/10.48550/arXiv.2210.06375.
  11. Bingkai Lin. A simple gap-producing reduction for the parameterized set cover problem. In Christel Baier, Ioannis Chatzigiannakis, Paola Flocchini, and Stefano Leonardi, editors, 46th International Colloquium on Automata, Languages, and Programming, ICALP 2019, July 9-12, 2019, Patras, Greece, volume 132 of LIPIcs, pages 81:1-81:15. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2019. URL: https://doi.org/10.4230/LIPIcs.ICALP.2019.81.
  12. Craig A. Tovey. A simplified NP-complete satisfiability problem. Discret. Appl. Math., 8(1):85-89, 1984. URL: https://doi.org/10.1016/0166-218X(84)90081-7.
  13. Leslie G. Valiant. A theory of the learnable. Commun. ACM, 27(11):1134-1142, 1984. URL: https://doi.org/10.1145/1968.1972.