# On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms

## File

LIPIcs.ISAAC.2017.32.pdf
• Filesize: 0.51 MB
• 12 pages

## Cite As

Casper Benjamin Freksen and Kasper Green Larsen. On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms. In 28th International Symposium on Algorithms and Computation (ISAAC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 92, pp. 32:1-32:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)
https://doi.org/10.4230/LIPIcs.ISAAC.2017.32

## Abstract

The Johnson-Lindenstrauss lemma is one of the cornerstone results in dimensionality reduction. It says that for any set of N vectors X \subset R^n, there exists a mapping f : X --> R^m such that f(X) preserves all pairwise distances between vectors in X to within (1 ± \eps) if m = O(\eps^{-2} lg N). Much effort has gone into developing fast embedding algorithms, with the Fast Johnson-Lindenstrauss transform of Ailon and Chazelle being one of the most well-known techniques. The current fastest algorithm that yields the optimal m = O(\eps^{-2} lg N) dimensions has an embedding time of O(n lg n + \eps^{-2} lg^3 N). An exciting approach towards improving this, due to Hinrichs and Vybíral, is to use a random m times n Toeplitz matrix for the embedding. Using the Fast Fourier Transform, the embedding of a vector can then be computed in O(n lg m) time. The big question is of course whether m = O(\eps^{-2} lg N) dimensions suffice for this technique. If so, this would end a decades-long quest to obtain faster and faster Johnson-Lindenstrauss transforms. The current best analysis of the embedding of Hinrichs and Vybíral shows that m = O(\eps^{-2} lg^2 N) dimensions suffice. The main result of this paper is a proof that this analysis unfortunately cannot be tightened any further, i.e., there exists a set of N vectors requiring m = \Omega(\eps^{-2} lg^2 N) for the Toeplitz approach to work.
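To make the abstract concrete, here is a minimal NumPy sketch of the circulant-style embedding in the spirit of Hinrichs and Vybíral: multiply the input (after random sign flips) by a random circulant matrix via the FFT convolution theorem and keep the first m coordinates. The function name, the Gaussian choice for the circulant's first column, and the normalization are illustrative assumptions, not the paper's exact construction; note also that this simple version uses a full-length FFT, so it runs in O(n lg n) rather than the O(n lg m) achievable by blocking the Toeplitz product into length-O(m) chunks.

```python
import numpy as np

def circulant_jl_embed(x, m, rng):
    """Illustrative circulant JL embedding (assumed variant, not the
    paper's exact construction): y = first m rows of circ(c) @ D @ x,
    scaled so squared norms are preserved in expectation."""
    n = x.shape[0]
    signs = rng.choice([-1.0, 1.0], size=n)   # random diagonal sign matrix D
    c = rng.standard_normal(n)                # first column of the circulant matrix
    # circ(c) @ (D x) via the FFT convolution theorem
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(signs * x)).real
    return y[:m] / np.sqrt(m)

rng = np.random.default_rng(0)
n, m = 1024, 128
x = rng.standard_normal(n)
y = circulant_jl_embed(x, m, rng)   # length-m embedding of x
```

The paper's lower bound concerns exactly how large m must be (as a function of N and \eps) for such an embedding to preserve all pairwise distances of N vectors.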

### Keywords
• dimensionality reduction
• Johnson-Lindenstrauss
• Toeplitz matrices

## References

1. Dimitris Achlioptas. Database-friendly random projections: Johnson-Lindenstrauss with binary coins. Journal of Computer and System Sciences, 66(4):671-687, June 2003. URL: http://dx.doi.org/10.1016/S0022-0000(03)00025-4.
2. Nir Ailon and Bernard Chazelle. The fast Johnson-Lindenstrauss transform and approximate nearest neighbors. SIAM Journal on Computing, 39(1):302-322, May 2009. URL: http://dx.doi.org/10.1137/060673096.
3. Nir Ailon and Edo Liberty. Fast dimension reduction using Rademacher series on dual BCH codes. Discrete & Computational Geometry, 42(4):615-630, 2009. URL: http://dx.doi.org/10.1007/s00454-008-9110-x.
4. Nir Ailon and Edo Liberty. An almost optimal unrestricted fast Johnson-Lindenstrauss transform. ACM Trans. Algorithms, 9(3):21:1-21:12, June 2013. URL: http://dx.doi.org/10.1145/2483699.2483701.
5. Jeremiah Blocki, Avrim Blum, Anupam Datta, and Or Sheffet. The Johnson-Lindenstrauss transform itself preserves differential privacy. In Proceedings of the 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science, FOCS '12, pages 410-419, Washington, DC, USA, 2012. IEEE Computer Society. URL: http://dx.doi.org/10.1109/FOCS.2012.67.
6. C. Boutsidis, A. Zouzias, M. W. Mahoney, and P. Drineas. Randomized dimensionality reduction for k-means clustering. IEEE Transactions on Information Theory, 61(2):1045-1062, February 2015. URL: http://dx.doi.org/10.1109/TIT.2014.2375327.
7. E. J. Candes, J. Romberg, and T. Tao. Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information. IEEE Transactions on Information Theory, 52(2):489-509, February 2006. URL: http://dx.doi.org/10.1109/TIT.2005.862083.
8. Michael B. Cohen, Sam Elder, Cameron Musco, Christopher Musco, and Madalina Persu. Dimensionality reduction for k-means clustering and low rank approximation. In Proceedings of the Forty-seventh Annual ACM Symposium on Theory of Computing, STOC '15, pages 163-172, New York, NY, USA, 2015. ACM. URL: http://dx.doi.org/10.1145/2746539.2746569.
9. Anirban Dasgupta, Ravi Kumar, and Tamás Sarlos. A sparse Johnson-Lindenstrauss transform. In Proceedings of the Forty-second ACM Symposium on Theory of Computing, STOC '10, pages 341-350, New York, NY, USA, 2010. ACM. URL: http://dx.doi.org/10.1145/1806689.1806737.
10. Sanjoy Dasgupta and Anupam Gupta. An elementary proof of a theorem of Johnson and Lindenstrauss. Random Struct. Algorithms, 22(1):60-65, 2003. URL: http://dx.doi.org/10.1002/rsa.10073.
11. D. L. Donoho. Compressed sensing. IEEE Transactions on Information Theory, 52(4):1289-1306, April 2006. URL: http://dx.doi.org/10.1109/TIT.2006.871582.
12. Casper Benjamin Freksen and Kasper Green Larsen. On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms. ArXiv e-prints, 2017. URL: http://arxiv.org/abs/1706.10110.
13. Sariel Har-Peled, Piotr Indyk, and Rajeev Motwani. Approximate nearest neighbor: Towards removing the curse of dimensionality. Theory of Computing, 8(14):321-350, 2012. URL: http://dx.doi.org/10.4086/toc.2012.v008a014.
14. Aicke Hinrichs and Jan Vybíral. Johnson-Lindenstrauss lemma for circulant matrices. Random Structures & Algorithms, 39(3):391-398, 2011. URL: http://dx.doi.org/10.1002/rsa.20360.
15. Piotr Indyk. Algorithmic applications of low-distortion geometric embeddings. In Proceedings of the 42nd IEEE Symposium on Foundations of Computer Science, FOCS '01, pages 10-33, Washington, DC, USA, 2001. IEEE Computer Society. URL: http://dx.doi.org/10.1109/SFCS.2001.959878.
16. William B. Johnson and Joram Lindenstrauss. Extensions of Lipschitz mappings into a Hilbert space. Contemporary Mathematics, 26:189-206, 1984. URL: http://dx.doi.org/10.1090/conm/026/737400.
17. Daniel M. Kane and Jelani Nelson. Sparser Johnson-Lindenstrauss transforms. J. ACM, 61(1):4:1-4:23, January 2014. URL: http://dx.doi.org/10.1145/2559902.
18. Felix Krahmer and Rachel Ward. New and improved Johnson-Lindenstrauss embeddings via the Restricted Isometry Property. SIAM J. Math. Anal., 43(3):1269-1281, 2011.
19. Kasper Green Larsen and Jelani Nelson. Optimality of the Johnson-Lindenstrauss lemma. In Proceedings of the 2017 IEEE 58th Annual Symposium on Foundations of Computer Science, FOCS '17, Washington, DC, USA, October 2017. IEEE Computer Society.
20. S. Muthukrishnan. Data Streams: Algorithms and Applications, volume 1(2) of Foundations and Trends in Theoretical Computer Science. now Publishers Inc., Hanover, MA, USA, January 2005. URL: http://dx.doi.org/10.1561/0400000002.
21. Daniel A. Spielman and Nikhil Srivastava. Graph sparsification by effective resistances. SIAM J. Comput., 40(6):1913-1926, December 2011. URL: http://dx.doi.org/10.1137/080734029.
22. Santosh S. Vempala. The random projection method, volume 65 of DIMACS - Series in Discrete Mathematics and Theoretical Computer Science. American Mathematical Society, Providence, RI, USA, September 2004. URL: http://dx.doi.org/10.1007/978-1-4615-0013-1_16.
23. Ky Vu, Pierre-Louis Poirion, and Leo Liberti. Using the Johnson-Lindenstrauss lemma in linear and integer programming. ArXiv e-prints, July 2015. URL: http://arxiv.org/abs/1507.00990.
24. Jan Vybíral. A variant of the Johnson-Lindenstrauss lemma for circulant matrices. Journal of Functional Analysis, 260(4):1096-1105, 2011. URL: http://dx.doi.org/10.1016/j.jfa.2010.11.014.
25. Kilian Weinberger, Anirban Dasgupta, John Langford, Alex Smola, and Josh Attenberg. Feature hashing for large scale multitask learning. In Proceedings of the 26th Annual International Conference on Machine Learning, ICML '09, pages 1113-1120, New York, NY, USA, 2009. ACM. URL: http://dx.doi.org/10.1145/1553374.1553516.
26. David P. Woodruff. Sketching as a Tool for Numerical Linear Algebra, volume 10(1-2) of Foundations and Trends in Theoretical Computer Science. now Publishers Inc., Hanover, MA, USA, 2014. URL: http://dx.doi.org/10.1561/0400000060.