The Product of Gaussian Matrices Is Close to Gaussian

Authors: Yi Li, David P. Woodruff


Author Details

Yi Li
  • Division of Mathematical Sciences, Nanyang Technological University, Singapore, Singapore
David P. Woodruff
  • Department of Computer Science, Carnegie Mellon University, Pittsburgh, PA, USA


Acknowledgements

D. Woodruff would like to thank Sébastien Bubeck, Sitan Chen, and Jerry Li for many helpful discussions.

Cite As

Yi Li and David P. Woodruff. The Product of Gaussian Matrices Is Close to Gaussian. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 207, pp. 35:1-35:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Abstract

We study the distribution of the matrix product G₁ G₂ ⋯ G_r of r independent Gaussian matrices of various sizes, where G_i is d_{i-1} × d_i, and we denote p = d₀, q = d_r, and require d₁ = d_{r-1}. Here the entries in each G_i are standard normal random variables with mean 0 and variance 1. Such products arise in the study of wireless communication, dynamical systems, and quantum transport, among other places. We show that, provided each d_i, i = 1, …, r, satisfies d_i ≥ C p ⋅ q, where C ≥ C₀ for a constant C₀ > 0 depending on r, the matrix product G₁ G₂ ⋯ G_r has variation distance at most δ to a p × q matrix G of i.i.d. Gaussian random variables with mean 0 and variance ∏_{i=1}^{r-1} d_i. Here δ → 0 as C → ∞. Moreover, we show a converse for constant r: if d_i < C′ max{p,q}^{1/2} min{p,q}^{3/2} for some i, then this total variation distance is at least δ′ for an absolute constant δ′ > 0 depending on C′ and r. This converse is best possible when p = Θ(q).
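As an informal illustration of both directions of the theorem (not the paper's proof), the sketch below samples the product of two Gaussian matrices. For r = 2, the theorem predicts that when the inner dimension d₁ is large relative to p ⋅ q, the entries of G₁ G₂ look like i.i.d. Gaussians with mean 0 and variance d₁; when the inner dimension is tiny, the product is rank-deficient and trivially distinguishable from Gaussian. The specific values of p, q, d, and the trial count are arbitrary choices for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, d, trials = 2, 2, 1000, 10000

# Sample many independent products G1 @ G2 with G1: p x d, G2: d x q,
# all entries i.i.d. N(0, 1).
samples = np.empty((trials, p, q))
for t in range(trials):
    G1 = rng.standard_normal((p, d))
    G2 = rng.standard_normal((d, q))
    samples[t] = G1 @ G2

entries = samples.reshape(-1)
# Each entry is a sum of d i.i.d. products of two standard normals,
# so its mean is 0 and its variance is d (the product of the inner dimensions).
print(f"mean     = {entries.mean():+.3f}  (expect ~0)")
print(f"variance = {entries.var():.1f}  (expect ~{d})")
# A standardized fourth moment near 3 is consistent with approximate Gaussianity.
z = entries / np.sqrt(d)
print(f"4th moment of standardized entries = {np.mean(z**4):.3f}  (Gaussian: 3)")

# Converse intuition: with inner dimension 1 < min(p, q), the product is an
# outer product, hence rank 1, which a genuine Gaussian matrix is with
# probability 0.
H = rng.standard_normal((p, 1)) @ rng.standard_normal((1, q))
print(f"rank of product with inner dimension 1: {np.linalg.matrix_rank(H)}")
```

Matching the first four moments is of course far weaker than a total variation bound; the paper's result controls the full joint distribution of all p ⋅ q entries.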

Subject Classification

ACM Subject Classification
  • Mathematics of computing → Probability and statistics

Keywords
  • random matrix theory
  • total variation distance
  • matrix product



