# Decomposing Overcomplete 3rd Order Tensors using Sum-of-Squares Algorithms

## File

LIPIcs.APPROX-RANDOM.2015.829.pdf
• Filesize: 0.5 MB
• 21 pages

## Cite As

Rong Ge and Tengyu Ma. Decomposing Overcomplete 3rd Order Tensors using Sum-of-Squares Algorithms. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 40, pp. 829-849, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)
https://doi.org/10.4230/LIPIcs.APPROX-RANDOM.2015.829

## Abstract

Tensor rank and low-rank tensor decompositions have many applications in learning and complexity theory. Most known algorithms use unfoldings of tensors and can only handle rank up to n^{\lfloor p/2 \rfloor} for a p-th order tensor. Previously, no efficient algorithm could decompose 3rd order tensors when the rank is super-linear in the dimension. Using ideas from the sum-of-squares hierarchy, we give the first quasi-polynomial time algorithm that can decompose a random 3rd order tensor when the rank is as large as n^{3/2}/polylog n. We also give a polynomial time algorithm for certifying the injective norm of random low-rank tensors. Our tensor decomposition algorithm exploits the relationship between the injective norm and the tensor components. The proof relies on interesting tools for decoupling random variables to prove better matrix concentration bounds.
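The n^{\lfloor p/2 \rfloor} barrier for unfolding-based methods can be seen concretely: for a 3rd order tensor (p = 3), any mode unfolding is an n × n^2 matrix whose rank is capped at n, so unfoldings cannot distinguish ranks beyond n. The following NumPy sketch (illustrative only; it is not the paper's sum-of-squares algorithm, and the dimensions n, r are arbitrary choices) builds a random rank-r tensor with r > n and shows its unfolding saturates at rank n:

```python
import numpy as np

n, r = 8, 20  # dimension n, tensor rank r > n (overcomplete regime)
rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))

# Rank-r 3rd order tensor T = sum_i a_i (x) b_i (x) c_i
T = np.einsum('ai,bi,ci->abc', A, B, C)

# Mode-1 unfolding: flatten the last two modes into an n x n^2 matrix.
T1 = T.reshape(n, n * n)

# The unfolding's rank is at most n even though the tensor rank is r = 20,
# which is why unfolding-based algorithms stall at rank ~ n^{floor(p/2)}.
print(np.linalg.matrix_rank(T1))  # 8 (= n): saturated, r is invisible
```

Generic random components make both ranks take their maximal values, so the gap between the tensor rank (20) and the unfolding rank (8) is exactly the overcomplete regime the paper's algorithm targets.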

## Keywords

• sum of squares
• overcomplete tensor decomposition


## References

1. Boris Alexeev, Michael A Forbes, and Jacob Tsimerman. Tensor rank: Some lower and upper bounds. In Computational Complexity (CCC), 2011 IEEE 26th Annual Conference on, pages 283-291. IEEE, 2011.
2. A. Anandkumar, D. P. Foster, D. Hsu, S. M. Kakade, and Y. K. Liu. Two SVDs Suffice: Spectral Decompositions for Probabilistic Topic Modeling and Latent Dirichlet Allocation. To appear in the special issue of Algorithmica on New Theoretical Challenges in Machine Learning, July 2013.
3. A. Anandkumar, R. Ge, D. Hsu, and S. M. Kakade. A Tensor Spectral Approach to Learning Mixed Membership Community Models. In Conference on Learning Theory (COLT), June 2013.
4. A. Anandkumar, R. Ge, D. Hsu, S. M. Kakade, and M. Telgarsky. Tensor Methods for Learning Latent Variable Models. J. of Machine Learning Research, 15:2773-2832, 2014.
5. A. Anandkumar, D. Hsu, and S. M. Kakade. A Method of Moments for Mixture Models and Hidden Markov Models. In Proc. of Conf. on Learning Theory, June 2012.
6. Anima Anandkumar, Rong Ge, and Majid Janzamin. Guaranteed Non-Orthogonal Tensor Decomposition via Alternating Rank-1 Updates. arXiv preprint arXiv:1402.5180, Feb. 2014.
7. Joseph Anderson, Mikhail Belkin, Navin Goyal, Luis Rademacher, and James Voss. The more, the merrier: the blessing of dimensionality for learning large Gaussian mixtures. arXiv preprint arXiv:1311.2891, 2013.
8. Boaz Barak, Fernando G.S.L. Brandao, Aram W. Harrow, Jonathan Kelner, David Steurer, and Yuan Zhou. Hypercontractivity, sum-of-squares proofs, and their applications. In Proceedings of the Forty-fourth Annual ACM Symposium on Theory of Computing, STOC'12, pages 307-326, New York, NY, USA, 2012. ACM.
9. Boaz Barak, Jonathan A. Kelner, and David Steurer. Rounding sum-of-squares relaxations. In STOC, pages 31-40, 2014.
10. Boaz Barak, Jonathan A. Kelner, and David Steurer. Dictionary learning and tensor decomposition via the sum-of-squares method. In Proceedings of the Forty-seventh Annual ACM Symposium on Theory of Computing, STOC'15, 2015.
11. Boaz Barak and Ankur Moitra. Tensor prediction, Rademacher complexity and random 3-XOR. http://arxiv.org/abs/1501.06521, 2015.
12. Boaz Barak and David Steurer. Sum-of-squares proofs and the quest toward optimal algorithms. In Proceedings of International Congress of Mathematicians (ICM), 2014. To appear.
13. Aditya Bhaskara, Moses Charikar, Ankur Moitra, and Aravindan Vijayaraghavan. Smoothed analysis of tensor decompositions. In Proceedings of the 46th Annual ACM Symposium on Theory of Computing, pages 594-603. ACM, 2014.
14. Joseph T. Chang. Full reconstruction of Markov models on evolutionary trees: Identifiability and consistency. Mathematical Biosciences, 137:51-73, 1996.
15. Pierre Comon. Tensor: a partial survey. Signal Processing Magazine, page 11, 2014.
16. Lieven De Lathauwer, Joséphine Castaing, and Jean-François Cardoso. Fourth-order cumulant-based blind identification of underdetermined mixtures. Signal Processing, IEEE Transactions on, 55(6):2965-2973, 2007.
17. Ignat Domanov and Lieven De Lathauwer. Canonical polyadic decomposition of third-order tensors: relaxed uniqueness conditions and algebraic algorithm. arXiv preprint arXiv:1501.07251, 2015.
18. Ignat Domanov and Lieven De Lathauwer. Canonical polyadic decomposition of third-order tensors: reduction to generalized eigenvalue decomposition. SIAM Journal on Matrix Analysis and Applications, 35(2):636-660, 2014.
19. Rong Ge, Qingqing Huang, and Sham M. Kakade. Learning mixtures of Gaussians in high dimensions. In Proceedings of the Forty-seventh Annual ACM Symposium on Theory of Computing, STOC'15, 2015.
20. Leonid Gurvits. Classical deterministic complexity of Edmonds' problem and quantum entanglement. In Proceedings of the Thirty-fifth Annual ACM Symposium on Theory of Computing, STOC'03, pages 10-19, New York, NY, USA, 2003. ACM.
21. Aram W. Harrow and Ashley Montanaro. Testing product states, quantum Merlin-Arthur games and tensor optimization. Journal of the ACM (JACM), 60(1):3, 2013.
22. Johan Håstad. Tensor rank is NP-complete. Journal of Algorithms, 11(4):644-654, 1990.
23. Christopher J. Hillar and Lek-Heng Lim. Most tensor problems are NP-hard. arXiv preprint arXiv:0911.1393, 2009.
24. J.B. Kruskal. Three-way arrays: Rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics. Linear algebra and its applications, 18(2):95-138, 1977.
25. Jean B Lasserre. Global optimization with polynomials and the problem of moments. SIAM Journal on Optimization, 11(3):796-817, 2001.
26. Elchanan Mossel and Sébastien Roch. Learning nonsingular phylogenies and hidden Markov models. Annals of Applied Probability, 16(2):583-614, 2006.
27. Pablo A Parrilo. Structured semidefinite programs and semialgebraic geometry methods in robustness and optimization. PhD thesis, California Institute of Technology, 2000.
28. Victor H. de la Peña and S. J. Montgomery-Smith. Decoupling inequalities for the tail probabilities of multivariate U-statistics. The Annals of Probability, 23(2):806-816, 1995.
29. Volker Strassen. Vermeidung von Divisionen. Journal für die reine und angewandte Mathematik, 264:184-202, 1973.
30. Joel A Tropp. User-friendly tail bounds for sums of random matrices. Foundations of Computational Mathematics, 12(4):389-434, 2012.