Abstract
Understanding the singular value spectrum of an n x n matrix A is a fundamental task in countless numerical computation and data analysis applications. In matrix multiplication time, it is possible to perform a full SVD of A and directly compute the singular values \sigma_1,...,\sigma_n. However, little is known about algorithms that break this runtime barrier.
Using tools from stochastic trace estimation, polynomial approximation, and fast linear system solvers, we show how to efficiently isolate different ranges of A's spectrum and approximate the number of singular values in these ranges. We thus effectively compute an approximate histogram of the spectrum, which can stand in for the true singular values in many applications.
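One of the building blocks mentioned above, stochastic (Hutchinson-style) trace estimation, can be illustrated with a minimal sketch. This is not the paper's algorithm — it simply shows how a trace-dependent spectral quantity (here tr(A^T A), the sum of squared singular values) can be estimated using only matrix-vector products, which is what allows runtimes proportional to nnz(A). The helper name `hutchinson_trace` is our own for illustration.

```python
import numpy as np

def hutchinson_trace(matvec, n, num_samples=200, rng=None):
    """Estimate tr(B) for a symmetric matrix B given only a matvec oracle.

    Hutchinson's estimator: for a random sign vector g, E[g^T B g] = tr(B),
    so averaging g^T B g over many probes approximates the trace without
    ever forming B explicitly.
    """
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(num_samples):
        g = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        total += g @ matvec(g)               # one sample of g^T B g
    return total / num_samples

# Example: estimate tr(A^T A) = sum_i sigma_i^2 via products with A and A^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
est = hutchinson_trace(lambda x: A.T @ (A @ x), 50, num_samples=500, rng=1)
exact = np.trace(A.T @ A)
```

In the paper's setting, the matvec oracle applies a polynomial (or rational) filter of A that approximately isolates a range of the spectrum, so the estimated trace approximately counts the singular values in that range.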
We use our histogram primitive to give the first algorithms for approximating a wide class of symmetric matrix norms and spectral sums faster than the best known runtime for matrix multiplication. For example, we show how to obtain a (1 + \epsilon) approximation to the Schatten 1-norm (i.e. the nuclear or trace norm) in just \tilde O((nnz(A)n^{1/3} + n^2)\epsilon^{-3}) time for A with uniform row sparsity, or \tilde O(n^{2.18} \epsilon^{-3}) time for dense matrices. The runtime scales smoothly for general Schatten-p norms, notably becoming \tilde O(p \cdot nnz(A) \epsilon^{-3}) for any real p >= 2.
At the same time, we show that the complexity of spectrum approximation is inherently tied to fast matrix multiplication in the small \epsilon regime. We use fine-grained complexity to give conditional lower bounds for spectrum approximation, showing that achieving milder \epsilon dependencies in our algorithms would imply triangle detection algorithms for general graphs running faster than state-of-the-art matrix multiplication time. This further implies, through a reduction of (Williams & Williams, 2010), that highly accurate spectrum approximation algorithms running in subcubic time can be used to give subcubic time matrix multiplication. As an application of our bounds, we show that precisely computing all effective resistances in a graph in less than matrix multiplication time is likely difficult, barring a major algorithmic breakthrough.
BibTeX Entry
@InProceedings{musco_et_al:LIPIcs:2018:8339,
author = {Cameron Musco and Praneeth Netrapalli and Aaron Sidford and Shashanka Ubaru and David P. Woodruff},
title = {{Spectrum Approximation Beyond Fast Matrix Multiplication: Algorithms and Hardness}},
booktitle = {9th Innovations in Theoretical Computer Science Conference (ITCS 2018)},
pages = {8:1--8:21},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-060-6},
ISSN = {1868-8969},
year = {2018},
volume = {94},
editor = {Anna R. Karlin},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {http://drops.dagstuhl.de/opus/volltexte/2018/8339},
URN = {urn:nbn:de:0030-drops-83397},
doi = {10.4230/LIPIcs.ITCS.2018.8},
annote = {Keywords: spectrum approximation, matrix norm computation, fine-grained complexity, linear algebra}
}
Keywords: spectrum approximation, matrix norm computation, fine-grained complexity, linear algebra
Seminar: 9th Innovations in Theoretical Computer Science Conference (ITCS 2018)
Issue Date: 2018
Date of publication: 05.01.2018