LIPIcs.ESA.2023.64.pdf
Boosting is a general method to convert a weak learner (which generates hypotheses that are just slightly better than random) into a strong learner (which generates hypotheses that are much better than random). Recently, Arunachalam and Maity [Arunachalam and Maity, 2020] gave the first quantum improvement for boosting, by combining Freund and Schapire’s AdaBoost algorithm with a quantum algorithm for approximate counting. Their booster is faster than classical boosting as a function of the VC-dimension of the weak learner’s hypothesis class, but worse as a function of the quality of the weak learner. In this paper we give a substantially faster and simpler quantum boosting algorithm, based on Servedio’s SmoothBoost algorithm [Servedio, 2003].
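For context, the classical loop that these quantum boosters speed up is Freund and Schapire's AdaBoost. Below is a minimal illustrative sketch in Python of that classical loop, not the paper's quantum algorithm; the `weak_learner` callable and all other names are hypothetical stand-ins for this sketch.

```python
import numpy as np

def adaboost(X, y, weak_learner, T):
    """Sketch of classical AdaBoost (Freund & Schapire).

    Assumptions (not from the paper): `y` has labels in {-1, +1}, and
    `weak_learner(X, y, w)` returns a hypothesis h, callable as h(X),
    whose weighted error under distribution w is below 1/2.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                    # uniform initial distribution
    hypotheses, alphas = [], []
    for _ in range(T):
        h = weak_learner(X, y, w)              # train on the weighted sample
        pred = h(X)                            # predictions in {-1, +1}
        eps = max(np.sum(w[pred != y]), 1e-12) # weighted error, clamped away from 0
        alpha = 0.5 * np.log((1 - eps) / eps)  # weight of this hypothesis
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified examples
        w /= w.sum()                           # renormalize to a distribution
        hypotheses.append(h)
        alphas.append(alpha)
    # Final strong hypothesis: sign of the weighted majority vote.
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in zip(alphas, hypotheses)))
```

Each round re-weights the training examples toward those the current hypotheses get wrong; SmoothBoost, which the paper builds on, constrains how far these distributions may drift from uniform, which is what makes it amenable to a quantum speedup.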