Ellipsoid Fitting up to a Constant

Authors: Jun-Ting Hsieh, Pravesh K. Kothari, Aaron Potechin, and Jeff Xu




File

LIPIcs.ICALP.2023.78.pdf
  • Filesize: 0.84 MB
  • 20 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.ICALP.2023.78

Author Details

Jun-Ting Hsieh
  • Carnegie Mellon University, Pittsburgh, PA, USA
Pravesh K. Kothari
  • Carnegie Mellon University, Pittsburgh, PA, USA
Aaron Potechin
  • University of Chicago, IL, USA
Jeff Xu
  • Carnegie Mellon University, Pittsburgh, PA, USA

Cite As

Jun-Ting Hsieh, Pravesh K. Kothari, Aaron Potechin, and Jeff Xu. Ellipsoid Fitting up to a Constant. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 78:1-78:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)
https://doi.org/10.4230/LIPIcs.ICALP.2023.78

Abstract

In [Saunderson, 2011; Saunderson et al., 2013], Saunderson, Parrilo, and Willsky (SPW) asked the following elegant geometric question: what is the largest m = m(d) such that there is an ellipsoid in ℝ^d that passes through v_1, v_2, …, v_m with high probability when the v_i's are chosen independently from the standard Gaussian distribution N(0, I_d)? The existence of such an ellipsoid is equivalent to the existence of a positive semidefinite matrix X such that v_i^⊤ X v_i = 1 for every 1 ⩽ i ⩽ m, a natural example of a random semidefinite program. SPW conjectured that m = (1-o(1)) d²/4 with high probability. Very recently, Potechin, Turner, Venkat, and Wein [Potechin et al., 2022] and Kane and Diakonikolas [Kane and Diakonikolas, 2022] proved that m ≳ d²/log^O(1) d via a certain natural, explicit construction. In this work, we give a substantially tighter analysis of their construction to prove that m ≳ d²/C for an absolute constant C > 0. This resolves one direction of the SPW conjecture up to a constant. Our analysis proceeds via the method of Graphical Matrix Decomposition, which has recently been used to analyze correlated random matrices arising in various areas [Barak et al., 2019; Bafna et al., 2022]. Our key new technical tool is a refined method for proving singular value upper bounds on certain correlated random matrices that are tight up to absolute, dimension-independent constants. In contrast, all previous methods for analyzing such matrices lose logarithmic factors in the dimension.
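
To make the semidefinite-programming formulation above concrete, the following sketch samples m standard Gaussian points in ℝ^d and checks feasibility of the SDP "find a PSD matrix X with v_i^⊤ X v_i = 1 for all i". It is a hedged illustration of the problem statement only, not the construction or the analysis from the paper; Python with numpy and cvxpy, the solver choice, and the helper name ellipsoid_fit_feasible are assumptions of this sketch.

    # A minimal sketch, assuming Python with numpy and cvxpy (and an
    # SDP-capable solver such as SCS or Clarabel); none of these tools
    # appear in the paper, and the function name is hypothetical.
    import numpy as np
    import cvxpy as cp

    def ellipsoid_fit_feasible(d, m, seed=0):
        """Sample m points from N(0, I_d) and test whether some PSD matrix X
        satisfies v_i^T X v_i = 1 for all i, i.e., whether an origin-centered
        ellipsoid {x : x^T X x = 1} passes through every sampled point."""
        rng = np.random.default_rng(seed)
        V = rng.standard_normal((m, d))      # rows are the Gaussian points v_i

        X = cp.Variable((d, d), PSD=True)    # the ellipsoid's defining matrix
        constraints = [cp.quad_form(V[i], X) == 1 for i in range(m)]
        prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility SDP
        prob.solve()
        return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

    if __name__ == "__main__":
        # The SPW conjecture predicts feasibility w.h.p. for m up to ~ d^2/4;
        # small d keeps the SDP solvable in seconds.
        d = 10
        for m in (15, 25, 40):
            print(f"d={d}, m={m}: feasible={ellipsoid_fit_feasible(d, m)}")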

Subject Classification

ACM Subject Classification
  • Theory of computation → Semidefinite programming
Keywords
  • Semidefinite programming
  • Random matrices
  • Average-case complexity


References

  1. Kwangjun Ahn, Dhruv Medarametla, and Aaron Potechin. Graph matrices: Norm bounds and applications. arXiv preprint, 2016. URL: https://arxiv.org/abs/1604.03423.
  2. Mitali Bafna, Jun-Ting Hsieh, Pravesh K. Kothari, and Jeff Xu. Polynomial-Time Power-Sum Decomposition of Polynomials. In 2022 IEEE 63rd Annual Symposium on Foundations of Computer Science (FOCS), pages 956-967. IEEE, 2022.
  3. Boaz Barak, Samuel Hopkins, Jonathan Kelner, Pravesh K. Kothari, Ankur Moitra, and Aaron Potechin. A Nearly Tight Sum-of-Squares Lower Bound for the Planted Clique Problem. SIAM Journal on Computing, 48(2):687-735, 2019.
  4. Mrinalkanti Ghosh, Fernando Granha Jeronimo, Chris Jones, Aaron Potechin, and Goutham Rajendran. Sum-of-squares lower bounds for Sherrington-Kirkpatrick via planted affine planes. In 2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS), pages 954-965. IEEE, 2020.
  5. Christopher Hoffman, Matthew Kahle, and Elliot Paquette. Spectral gaps of random graphs and applications. International Mathematics Research Notices, 2019.
  6. Jun-Ting Hsieh and Pravesh K. Kothari. Algorithmic Thresholds for Refuting Random Polynomial Systems. In Proceedings of the 2022 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 1154-1203. SIAM, 2022.
  7. Chris Jones, Aaron Potechin, Goutham Rajendran, Madhur Tulsiani, and Jeff Xu. Sum-of-squares lower bounds for sparse independent set. In 2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS), pages 406-416. IEEE, 2022.
  8. Daniel M. Kane and Ilias Diakonikolas. A Nearly Tight Bound for Fitting an Ellipsoid to Gaussian Random Points. arXiv preprint, 2022. URL: https://arxiv.org/abs/2212.11221.
  9. Sidhanth Mohanty, Prasad Raghavendra, and Jeff Xu. Lifting sum-of-squares lower bounds: degree-2 to degree-4. In Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing, pages 840-853, 2020.
  10. Aaron Potechin, Paxton Turner, Prayaag Venkat, and Alexander S. Wein. Near-optimal fitting of ellipsoids to random points. arXiv preprint, 2022. URL: https://arxiv.org/abs/2208.09493.
  11. James Saunderson. Subspace identification via convex optimization. PhD thesis, Massachusetts Institute of Technology, 2011.
  12. James Saunderson, Venkat Chandrasekaran, Pablo A. Parrilo, and Alan S. Willsky. Diagonal and low-rank matrix decompositions, correlation matrices, and ellipsoid fitting. SIAM Journal on Matrix Analysis and Applications, 33(4):1395-1416, 2012.
  13. James Saunderson, Pablo A. Parrilo, and Alan S. Willsky. Diagonal and low-rank decompositions and fitting ellipsoids to random points. In 52nd IEEE Conference on Decision and Control, pages 6031-6036. IEEE, 2013.
  14. Terence Tao. Topics in random matrix theory, volume 132. American Mathematical Society, 2012.
  15. Max A. Woodbury. Inverting modified matrices. Memorandum Rept. 42, Statistical Research Group, 1950.