Quantum Algorithms and Lower Bounds for Linear Regression with Norm Constraints

Authors: Yanlin Chen, Ronald de Wolf



File

LIPIcs.ICALP.2023.38.pdf
  • Filesize: 0.78 MB
  • 21 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.ICALP.2023.38

Author Details

Yanlin Chen
  • QuSoft and CWI, Amsterdam, The Netherlands
Ronald de Wolf
  • QuSoft and CWI, Amsterdam, The Netherlands
  • University of Amsterdam, The Netherlands

Acknowledgements

We thank Yi-Shan Wu and Christian Majenz for useful discussions, and Armando Bellante for pointing us to [Armando Bellante and Stefano Zanero, 2022].

Cite As

Yanlin Chen and Ronald de Wolf. Quantum Algorithms and Lower Bounds for Linear Regression with Norm Constraints. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 38:1-38:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)
https://doi.org/10.4230/LIPIcs.ICALP.2023.38

Abstract

Lasso and Ridge are important minimization problems in machine learning and statistics. They are versions of linear regression with squared loss where the vector θ ∈ ℝ^d of coefficients is constrained in either 𝓁₁-norm (for Lasso) or in 𝓁₂-norm (for Ridge). We study the complexity of quantum algorithms for finding ε-minimizers for these minimization problems. We show that for Lasso we can get a quadratic quantum speedup in terms of d by speeding up the cost-per-iteration of the Frank-Wolfe algorithm, while for Ridge the best quantum algorithms are linear in d, as are the best classical algorithms. As a byproduct of our quantum lower bound for Lasso, we also prove the first classical lower bound for Lasso that is tight up to polylog-factors.
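To make the Lasso setup above concrete, below is a minimal sketch of the classical Frank-Wolfe baseline that the abstract refers to: squared-loss minimization over the 𝓁₁-ball. The function name frank_wolfe_lasso, the radius parameter tau, the iteration count, and the synthetic data are our own illustrative assumptions, not notation from the paper, and the sketch shows only the classical routine, not the quantum algorithm.

# Minimal classical Frank-Wolfe sketch for Lasso over the l1-ball of radius `tau`.
# Illustration only, under our own naming assumptions; not the paper's quantum algorithm.
import numpy as np

def frank_wolfe_lasso(X, y, tau=1.0, num_iters=500):
    """Approximately minimize (1/(2n)) * ||X @ theta - y||^2 subject to ||theta||_1 <= tau."""
    n, d = X.shape
    theta = np.zeros(d)
    for t in range(num_iters):
        # Gradient of the squared loss at the current iterate.
        grad = X.T @ (X @ theta - y) / n
        # Linear minimization over the l1-ball: the minimizer is a signed vertex
        # +/- tau * e_i at the coordinate with the largest-magnitude gradient entry.
        i = int(np.argmax(np.abs(grad)))
        s = np.zeros(d)
        s[i] = -tau * np.sign(grad[i])
        # Standard step size 2/(t+2); the convex combination stays inside the l1-ball.
        gamma = 2.0 / (t + 2.0)
        theta = (1.0 - gamma) * theta + gamma * s
    return theta

# Tiny usage example on synthetic sparse data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    theta_true = np.zeros(50)
    theta_true[:3] = [0.5, -0.3, 0.2]
    y = X @ theta_true + 0.01 * rng.standard_normal(200)
    theta_hat = frank_wolfe_lasso(X, y, tau=1.0, num_iters=1000)
    print("l1 norm of estimate:", np.abs(theta_hat).sum())

In this classical sketch the per-iteration work is dominated by forming the gradient and selecting its largest-magnitude coordinate; it is this cost per iteration that, per the abstract, the quantum algorithm for Lasso improves, giving a quadratic speedup in d.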

Subject Classification

ACM Subject Classification
  • Mathematics of computing → Mathematical optimization
  • Theory of computation → Quantum computation theory
Keywords
  • Quantum algorithms
  • Regularized linear regression
  • Lasso
  • Ridge
  • Lower bounds


References

  1. Jonathan Allcock and Chang-Yu Hsieh. A quantum extension of SVM-perf for training nonlinear SVMs in almost linear time. Quantum, 4:342, 2020. URL: https://arxiv.org/abs/2006.10299.
  2. Joran van Apeldoorn. A quantum view on convex optimization. PhD thesis, Universiteit van Amsterdam, 2020.
  3. Joran van Apeldoorn and András Gilyén. Quantum algorithms for zero-sum games, 2019. URL: https://arxiv.org/abs/1904.03180.
  4. Joran van Apeldoorn, András Gilyén, Sander Gribling, and Ronald de Wolf. Convex optimization using quantum oracles. Quantum, 4:220, 2020. URL: https://arxiv.org/abs/1809.00643.
  5. Joran van Apeldoorn, András Gilyén, Sander Gribling, and Ronald de Wolf. Quantum SDP-solvers: better upper and lower bounds. Quantum, 4:230, 2020. Earlier version in FOCS'17. URL: https://arxiv.org/abs/1705.01843.
  6. Joran van Apeldoorn and András Gilyén. Improvements in quantum SDP-solving with applications. In Proceedings of 46th International Colloquium on Automata, Languages, and Programming, volume 132 of Leibniz International Proceedings in Informatics, pages 99:1-99:15, 2019. URL: https://arxiv.org/abs/1804.05058.
  7. Joran van Apeldoorn, Sander Gribling, Yinan Li, Harold Nieuwboer, Michael Walter, and Ronald de Wolf. Quantum algorithms for matrix scaling and matrix balancing. In Proceedings of 48th International Colloquium on Automata, Languages, and Programming, volume 198 of Leibniz International Proceedings in Informatics, pages 110:1-17, 2021. URL: https://arxiv.org/abs/2011.12823.
  8. Simon Apers and Ronald de Wolf. Quantum speedup for graph sparsification, cut approximation and Laplacian solving. In Proceedings of 61st IEEE Annual Symposium on Foundations of Computer Science, pages 637-648, 2020. URL: https://arxiv.org/abs/1911.07306.
  9. Srinivasan Arunachalam and Reevu Maity. Quantum boosting. In Proceedings of 37th International Conference on Machine Learning (ICML'20), 2020. URL: https://arxiv.org/abs/2002.05056.
  10. Robert Beals, Harry Buhrman, Richard Cleve, Michele Mosca, and Ronald de Wolf. Quantum lower bounds by polynomials. Journal of the ACM, 48(4):778-797, 2001. Earlier version in FOCS'98. URL: https://arxiv.org/abs/quant-ph/9802049.
  11. Armando Bellante and Stefano Zanero. Quantum matching pursuit: A quantum algorithm for sparse representations. Physical Review A, 105:022414, 2022.
  12. Aleksandrs Belovs and Troy Lee. The quantum query complexity of composition with a relation, 2020. URL: https://arxiv.org/abs/2004.06439.
  13. Fernando Brandão, Amir Kalev, Tongyang Li, Cedric Yen-Yu Lin, Krysta Svore, and Xiaodi Wu. Quantum SDP solvers: Large speed-ups, optimality, and applications to quantum learning. In Proceedings of 46th International Colloquium on Automata, Languages, and Programming, volume 132 of Leibniz International Proceedings in Informatics, pages 27:1-27:14, 2019. URL: https://arxiv.org/abs/1710.02581.
  14. Fernando Brandão and Krysta Svore. Quantum speed-ups for solving semidefinite programs. In Proceedings of 58th IEEE Annual Symposium on Foundations of Computer Science, FOCS, pages 415-426, 2017. URL: https://arxiv.org/abs/1609.05537.
  15. Peter Bühlmann and Sara van de Geer. Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer, 2011.
  16. Nicolò Cesa-Bianchi, Shai Shalev-Shwartz, and Ohad Shamir. Efficient learning with partially observed attributes. Journal of Machine Learning Research, 12:2857-2878, 2011. URL: https://arxiv.org/abs/1004.4421.
  17. Shouvanik Chakrabarti, Andrew Childs, Tongyang Li, and Xiaodi Wu. Quantum algorithms and lower bounds for convex optimization. Quantum, 4:221, 2020. URL: https://arxiv.org/abs/1809.01731.
  18. Shantanav Chakraborty, András Gilyén, and Stacey Jeffery. The power of block-encoded matrix powers: improved regression techniques via faster Hamiltonian simulation. In Proceedings of 46th International Colloquium on Automata, Languages, and Programming, volume 132 of Leibniz International Proceedings in Informatics, pages 33:1-33:14, 2019. URL: https://arxiv.org/abs/1804.01973.
  19. Shantanav Chakraborty, Aditya Morolia, and Anurudh Peduri. Quantum regularized least squares, 2022. URL: https://arxiv.org/abs/2206.13143.
  20. Yuxuan Du, Min-Hsiu Hsieh, Tongliang Liu, Shan You, and Dacheng Tao. Quantum differentially private sparse regression learning, 2020. URL: https://arxiv.org/abs/2007.11921.
  21. Christoph Dürr and Peter Høyer. A quantum algorithm for finding the minimum, 1996. URL: https://arxiv.org/abs/quant-ph/9607014.
  22. Marguerite Frank and Philip Wolfe. An algorithm for quadratic programming. Naval Research Logistics Quarterly, 3(1-2):95-110, 1956.
  23. Ankit Garg, Robin Kothari, Praneeth Netrapalli, and Suhail Sherif. Near-optimal lower bounds for convex optimization for all orders of smoothness. In Proceedings of 35th Conference on Neural Information Processing Systems, 2021.
  24. Ankit Garg, Robin Kothari, Praneeth Netrapalli, and Suhail Sherif. No quantum speedup over gradient descent for non-smooth convex optimization. In Proceedings of 12th Innovations in Theoretical Computer Science Conference, volume 185 of Leibniz International Proceedings in Informatics, pages 53:1-53:20, 2021. URL: https://arxiv.org/abs/2010.01801.
  25. András Gilyén, Seth Lloyd, and Ewin Tang. Quantum-inspired low-rank stochastic regression with logarithmic dependence on the dimension, 2018. URL: https://arxiv.org/abs/1811.04909.
  26. Sander Gribling and Harold Nieuwboer. Improved quantum lower and upper bounds for matrix scaling. In Proceedings of 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022), volume 219 of Leibniz International Proceedings in Informatics, pages 35:1-35:23, 2022. URL: https://arxiv.org/abs/2109.15282.
  27. Aram Harrow, Avinatan Hassidim, and Seth Lloyd. Quantum algorithm for solving linear systems of equations. Physical Review Letters, 103(15):150502, 2009. URL: https://arxiv.org/abs/0811.3171.
  28. Elad Hazan and Tomer Koren. Linear regression with limited observation. In Proceedings of the 29th International Conference on Machine Learning, 2012. URL: https://arxiv.org/abs/1206.4678. More extensive version at https://arxiv.org/abs/1108.4559.
  29. Arthur Hoerl and Robert Kennard. Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 12(1):55-67, 1970.
  30. Adam Izdebski and Ronald de Wolf. Improved quantum boosting, 2020. URL: https://arxiv.org/abs/2009.08360.
  31. Martin Jaggi. Revisiting Frank-Wolfe: Projection-free sparse convex optimization. In Proceedings of the 30th International Conference on Machine Learning, volume 28, pages 427-435, 2013.
  32. Martin Jaggi. An equivalence between the Lasso and Support Vector Machines. In Johan Suykens, Marco Signoretto, and Andreas Argyriou, editors, Regularization, Optimization, Kernels, and Support Vector Machines. CRC Press, 2014. URL: https://arxiv.org/abs/1303.1152.
  33. Iordanis Kerenidis and Anupam Prakash. Quantum recommendation systems. In Proceedings of 8th Innovations in Theoretical Computer Science Conference, volume 67 of Leibniz International Proceedings in Informatics, pages 49:1-49:21, 2017. URL: https://arxiv.org/abs/1603.08675.
  34. Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar. Foundations of Machine Learning. Adaptive Computation and Machine Learning series. MIT Press, second edition, 2018.
  35. Ashwin Nayak and Felix Wu. The quantum query complexity of approximating the median and related statistics. In Proceedings of the 31st Annual ACM Symposium on Theory of Computing, pages 384-393. ACM, 1999. URL: https://arxiv.org/abs/quant-ph/9804066.
  36. Yurii Nesterov. A method for solving the convex programming problem with convergence rate 𝒪(1/k²). Proceedings of the USSR Academy of Sciences, 269:543-547, 1983.
  37. Anupam Prakash. Quantum Algorithms for Linear Algebra and Machine Learning. PhD thesis, University of California, Berkeley, 2014.
  38. Patrick Rebentrost, Masoud Mohseni, and Seth Lloyd. Quantum support vector machine for big data classification. Physical Review Letters, 113(13):130503, 2014. URL: https://arxiv.org/abs/1307.0471.
  39. Seyran Saeedi and Tom Arodz. Quantum sparse support vector machines, 2019. URL: https://arxiv.org/abs/1902.01879.
  40. Seyran Saeedi, Aliakbar Panahi, and Tom Arodz. Quantum semi-supervised kernel learning. Quantum Machine Intelligence, 3:24, 2021.
  41. Maria Schuld and Nathan Killoran. Quantum machine learning in feature Hilbert spaces. Physical Review Letters, 122(13):040504, 2019. URL: https://arxiv.org/abs/1803.07128.
  42. Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine Learning - From Theory to Algorithms. Cambridge University Press, 2014.
  43. Robert Tibshirani. Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society, 58:267-288, 1996.
  44. Hrishikesh Vinod. A survey of Ridge regression and related techniques for improvements over ordinary least squares. The Review of Economics and Statistics, 60(1):121-131, 1978.
  45. Chenyi Zhang, Jiaqi Leng, and Tongyang Li. Quantum algorithms for escaping from saddle points. Quantum, 5:229, 2021. URL: https://arxiv.org/abs/2007.10253.