Boosting is a general method to convert a weak learner (which generates hypotheses that are just slightly better than random) into a strong learner (which generates hypotheses that are much better than random). Recently, Arunachalam and Maity [Arunachalam and Maity, 2020] gave the first quantum improvement for boosting, by combining Freund and Schapire's AdaBoost algorithm with a quantum algorithm for approximate counting. Their booster is faster than classical boosting as a function of the VC-dimension of the weak learner's hypothesis class, but worse as a function of the quality of the weak learner. In this paper we give a substantially faster and simpler quantum boosting algorithm, based on Servedio's SmoothBoost algorithm [Servedio, 2003].
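To make the weak-to-strong conversion concrete, the following is a minimal sketch of classical AdaBoost (Freund and Schapire), not of the paper's quantum booster. The decision-stump weak learner, the function names, and the toy dataset are illustrative assumptions, not taken from the paper.

import numpy as np

def weak_learner(X, y, w):
    """Illustrative weak learner: the best single-feature threshold
    ("decision stump") under the current example weights w."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = sign * np.where(X[:, j] >= thr, 1, -1)
                err = np.sum(w[pred != y])  # weighted error
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    j, thr, sign = best
    return lambda Z: sign * np.where(Z[:, j] >= thr, 1, -1)

def adaboost(X, y, T=20):
    """Classical AdaBoost: reweight the examples each round so the next
    weak hypothesis focuses on previously misclassified points."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start from the uniform distribution
    hyps, alphas = [], []
    for _ in range(T):
        h = weak_learner(X, y, w)
        pred = h(X)
        eps = np.sum(w[pred != y])       # weighted error of this round's hypothesis
        if eps >= 0.5:                   # no weak-learning advantage left
            break
        alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-12))
        w *= np.exp(-alpha * y * pred)   # up-weight mistakes, down-weight correct points
        w /= w.sum()
        hyps.append(h)
        alphas.append(alpha)
    # Strong hypothesis: sign of the weighted majority vote of the weak hypotheses.
    return lambda Z: np.sign(sum(a * h(Z) for a, h in zip(alphas, hyps)))

# Toy usage: labels follow the sign of the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] > 0, 1, -1)
H = adaboost(X, y)
print("training accuracy:", np.mean(H(X) == y))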
@InProceedings{izdebski_et_al:LIPIcs.ESA.2023.64,
  author    = {Izdebski, Adam and de Wolf, Ronald},
  title     = {{Improved Quantum Boosting}},
  booktitle = {31st Annual European Symposium on Algorithms (ESA 2023)},
  pages     = {64:1--64:16},
  series    = {Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN      = {978-3-95977-295-2},
  ISSN      = {1868-8969},
  year      = {2023},
  volume    = {274},
  editor    = {G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address   = {Dagstuhl, Germany},
  URL       = {https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.64},
  URN       = {urn:nbn:de:0030-drops-187178},
  doi       = {10.4230/LIPIcs.ESA.2023.64},
  annote    = {Keywords: Learning theory, Boosting algorithms, Quantum computing}
}