Efficient Block Approximate Matrix Multiplication

Authors: Chuhan Yang, Christopher Musco




File
  • LIPIcs.ESA.2023.103.pdf (3.09 MB, 15 pages)

Author Details

Chuhan Yang
  • Division of Engineering, New York University Abu Dhabi, UAE
  • Tandon School of Engineering, New York University, NY, USA
Christopher Musco
  • Tandon School of Engineering, New York University, NY, USA

Cite As

Chuhan Yang and Christopher Musco. Efficient Block Approximate Matrix Multiplication. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 103:1-103:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023) https://doi.org/10.4230/LIPIcs.ESA.2023.103

Abstract

Randomized matrix algorithms have had significant recent impact on numerical linear algebra. One especially powerful class of methods is algorithms for approximate matrix multiplication based on sampling. Such methods typically sample individual matrix rows and columns using carefully chosen importance sampling probabilities. However, due to practical considerations like memory locality and the preservation of matrix structure, it is often preferable to sample contiguous blocks of rows and columns together. Recently, Wu (2018) addressed this setting by developing an approximate matrix multiplication method based on block sampling. However, the method is inefficient, as it requires knowledge of optimal importance sampling probabilities that are expensive to compute.
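The basic sampling approach described above admits a compact implementation. The sketch below is a minimal NumPy illustration of classic norm-based column/row sampling in the style of Drineas, Kannan, and Mahoney (2006), not the block method of this paper; the function name is ours:

```python
import numpy as np

def approx_matmul(A, B, s, rng=None):
    """Monte Carlo approximation of A @ B from s sampled column/row pairs.

    Samples column i of A (paired with row i of B) with probability
    proportional to ||A[:, i]|| * ||B[i, :]||, the optimal importance
    distribution for this estimator, then rescales each sampled pair by
    1 / (s * p_i) so the estimate is unbiased.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[1]
    weights = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = weights / weights.sum()
    idx = rng.choice(n, size=s, p=p)
    scale = 1.0 / (s * p[idx])          # rescaling keeps E[estimate] = A @ B
    return (A[:, idx] * scale) @ B[idx, :]
```

Block sampling replaces the individual column/row pairs here with contiguous groups of columns and rows, which is where computing good sampling probabilities becomes the expensive step that the paper targets.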
We address this issue by showing that Wu's method can be accelerated through the use of a randomized implicit trace estimation method. Doing so provably reduces the cost of sampling to near-linear in the size of the matrices being multiplied, without impacting the accuracy of the final approximate matrix multiplication. Overall, this yields a fast practical algorithm, which we test on a number of synthetic and real-world data sets. We complement our algorithmic contribution with the first extensive empirical comparison of block algorithms for randomized matrix multiplication. Our method offers a significant runtime advantage over that of Wu (2018) and also outperforms basic uniform sampling of blocks. However, we find that another recent method (Charalambides, 2021), which uses sub-optimal but efficiently computable sampling probabilities, often (but not always) offers the best trade-off between speed and accuracy.
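The trace estimation tool referenced above can be illustrated with Hutchinson's estimator (Hutchinson, 1990), which estimates the trace of a matrix accessed only through matrix-vector products. The sketch below is a minimal illustration of that general technique, not the paper's implementation; the function name is ours:

```python
import numpy as np

def hutchinson_trace(matvec, n, num_queries, rng=None):
    """Hutchinson's stochastic estimator for tr(M), M implicit and n x n.

    Averages z^T (M z) over random +/-1 sign vectors z. Each query costs
    one matrix-vector product, so a quantity like tr(A_k^T A_k), which
    drives block sampling probabilities, can be estimated without ever
    forming the product explicitly.
    """
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(num_queries):
        z = rng.choice([-1.0, 1.0], size=n)
        total += z @ matvec(z)      # z^T M z, an unbiased estimate of tr(M)
    return total / num_queries
```

For a diagonal matrix every sign vector recovers the trace exactly, since the signs square to one; in general the variance shrinks as the number of queries grows.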

Subject Classification

ACM Subject Classification
  • Theory of computation → Sketching and sampling
Keywords
  • Approximate matrix multiplication
  • randomized numerical linear algebra
  • trace estimation


References

  1. Haim Avron and Sivan Toledo. Randomized algorithms for estimating the trace of an implicit symmetric positive semi-definite matrix. J. ACM, 58(2), 2011.
  2. Deng Cai, Xuanhui Wang, and Xiaofei He. Probabilistic dyadic data analysis with local and global consistency. In Proceedings of the 26th Annual International Conference on Machine Learning (ICML'09), pages 105-112, 2009.
  3. Neophytos Charalambides, Mert Pilanci, and Alfred O. Hero. Approximate weighted CR coded matrix multiplication. In Proceedings of the 2021 International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pages 5095-5099. IEEE, 2021.
  4. Kenneth Clarkson and David Woodruff. Numerical linear algebra in the streaming model. In Proceedings of the 41st Annual ACM Symposium on Theory of Computing (STOC), pages 205-214, 2009.
  5. Kenneth L. Clarkson and David P. Woodruff. Low rank approximation and regression in input sparsity time. In Proceedings of the 45th Annual ACM Symposium on Theory of Computing (STOC), pages 81-90, 2013.
  6. Edith Cohen and David D. Lewis. Approximating matrix multiplication for pattern recognition tasks. Journal of Algorithms, 30(2):211-252, 1999.
  7. Michael B. Cohen, Jelani Nelson, and David P. Woodruff. Optimal approximate matrix product in terms of stable rank. In Proceedings of the 43rd International Colloquium on Automata, Languages and Programming (ICALP), 2016.
  8. Alice Cortinovis and Daniel Kressner. On randomized trace estimates for indefinite matrices with an application to determinants. Foundations of Computational Mathematics, 22(3):875-903, 2022.
  9. Amey Desai, Mina Ghashami, and Jeff M. Phillips. Improved practical matrix sketching with guarantees. IEEE Transactions on Knowledge and Data Engineering, 28(7):1678-1690, 2016.
  10. Amit Deshpande and Santosh Vempala. Adaptive sampling and fast low-rank matrix approximation. In Proceedings of the 10th International Workshop on Randomization and Computation (RANDOM), pages 292-303, 2006.
  11. Petros Drineas and Ravi Kannan. Fast Monte-Carlo algorithms for approximate matrix multiplication. In Proceedings of the 42nd IEEE Symposium on Foundations of Computer Science (FOCS), pages 452-459. IEEE, 2001.
  12. Petros Drineas, Ravi Kannan, and Michael W. Mahoney. Fast Monte Carlo algorithms for matrices I: Approximating matrix multiplication. SIAM Journal on Computing, 36(1):132-157, 2006.
  13. Petros Drineas, Ravi Kannan, and Michael W. Mahoney. Fast Monte Carlo algorithms for matrices II: Computing a low-rank approximation to a matrix. SIAM Journal on Computing, 36(1):158-183, 2006.
  14. Petros Drineas and Michael W. Mahoney. RandNLA: Randomized numerical linear algebra. Commun. ACM, 59(6), 2016.
  15. Sylvester Eriksson-Bique, Mary Solbrig, Michael Stefanelli, Sarah Warkentin, Ralph Abbey, and Ilse C. F. Ipsen. Importance sampling for a Monte Carlo matrix multiplication algorithm, with application to information retrieval. SIAM Journal on Scientific Computing, 33(4):1689-1706, 2011.
  16. Didier Girard. Un algorithme simple et rapide pour la validation croisée généralisée sur des problèmes de grande taille. Technical report, École nationale supérieure d'informatique et de mathématiques appliquées de Grenoble, 1987.
  17. Nathan Halko, Per-Gunnar Martinsson, and Joel A. Tropp. Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions. SIAM Review, 53(2):217-288, 2011.
  18. Michael F. Hutchinson. A stochastic estimator of the trace of the influence matrix for Laplacian smoothing splines. Communications in Statistics - Simulation and Computation, 19(2):433-450, 1990.
  19. Tayyebeh Jahani-Nezhad and Mohammad Ali Maddah-Ali. CodedSketch: A coding scheme for distributed computation of approximated matrix multiplication. IEEE Transactions on Information Theory, 67(6):4185-4196, 2021.
  20. Humberto Madrid, Valia Guerra, and Marielba Rojas. Sampling techniques for Monte Carlo matrix multiplication with applications to image processing. In Mexican Conference on Pattern Recognition, pages 45-54. Springer, 2012.
  21. Raphael A. Meyer, Cameron Musco, Christopher Musco, and David P. Woodruff. Hutch++: Optimal stochastic trace estimation. In Symposium on Simplicity in Algorithms (SOSA), pages 142-155. SIAM, 2021.
  22. Chengmei Niu and Hanyu Li. Optimal sampling algorithms for block matrix multiplication. Journal of Computational and Applied Mathematics, article 115063, 2023.
  23. Christos H. Papadimitriou, Prabhakar Raghavan, Hisao Tamaki, and Santosh Vempala. Latent semantic indexing: A probabilistic analysis. Journal of Computer and System Sciences, 61(2):217-235, 2000.
  24. Vladimir Rokhlin and Mark Tygert. A fast randomized algorithm for overdetermined linear least-squares regression. Proceedings of the National Academy of Sciences, 105(36):13212-13217, 2008.
  25. Tamas Sarlos. Improved approximation algorithms for large matrices via random projections. In Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 143-152, 2006.
  26. NYC Taxi and Limousine Commission (TLC). TLC trip record data. https://www.nyc.gov/site/tlc/about/tlc-trip-record-data.page, 2022.
  27. David P. Woodruff. Sketching as a tool for numerical linear algebra. Foundations and Trends in Theoretical Computer Science, 10(1-2):1-157, 2014.
  28. Yue Wu. A note on random sampling for matrix multiplication. arXiv preprint, 2018. URL: https://arxiv.org/abs/1811.11237.
  29. Yue Wu, Dimitris Kamilis, and Nick Polydorides. A randomised finite element method for elliptic partial differential equations. arXiv preprint, 2019. URL: https://arxiv.org/abs/1903.07696.