From TCS to Learning Theory (Invited Paper)

Author: Kasper Green Larsen






Author Details

Kasper Green Larsen
  • Aarhus University, Denmark

Cite As

Kasper Green Larsen. From TCS to Learning Theory (Invited Paper). In 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 306, pp. 4:1-4:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)
https://doi.org/10.4230/LIPIcs.MFCS.2024.4

Abstract

While machine learning theory and theoretical computer science are both built on a solid mathematical foundation, the two research communities overlap less than the proximity of the fields warrants. In this invited abstract, I argue that traditional theoretical computer scientists have much to offer the learning theory community, and vice versa. I make this argument by telling a personal story of how I broadened my research focus to encompass learning theory, and how my TCS background has been extremely useful in doing so. It is my hope that this personal account may inspire more TCS researchers to tackle the many elegant and important theoretical questions that learning theory has to offer.

Subject Classification

ACM Subject Classification
  • Theory of computation
Keywords
  • Theoretical Computer Science
  • Learning Theory
