A Greedy Algorithm for Subspace Approximation Problem

Author: Nguyen Kim Thang



Author Details

Nguyen Kim Thang
  • IBISC, Univ Evry, University Paris Saclay, Evry, France

Cite As

Nguyen Kim Thang. A Greedy Algorithm for Subspace Approximation Problem. In 16th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 101, pp. 30:1-30:7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)
https://doi.org/10.4230/LIPIcs.SWAT.2018.30

Abstract

In the subspace approximation problem, given m points in R^{n} and an integer k <= n, the goal is to find a k-dimensional subspace of R^{n} that minimizes the l_{p}-norm of the Euclidean distances from the given points to the subspace. This problem generalizes several classical subspace approximation problems and has applications in statistics, machine learning, signal processing, and biology. Deshpande et al. [Deshpande et al., 2011] gave a randomized O(sqrt{p})-approximation, and this bound was shown to be tight assuming P != NP by Guruswami et al. [Guruswami et al., 2016]. Determining the performance guarantee achievable by deterministic algorithms for the problem is an intriguing question. In this paper, we present a simple deterministic O(sqrt{p})-approximation algorithm together with a simple analysis. This settles the approximability of the problem up to a constant factor. Moreover, the simplicity of the algorithm makes it practically appealing.
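For illustration only (not taken from the paper): a minimal sketch of the objective described above, assuming the m points are the rows of a NumPy array A and the candidate k-dimensional subspace is given by a matrix V with orthonormal columns; the names subspace_cost, A, and V are placeholders.

import numpy as np

def subspace_cost(A, V, p):
    """l_p cost of the subspace span(V) for the points in the rows of A.

    A: (m, n) array of m points in R^n
    V: (n, k) matrix with orthonormal columns spanning the candidate subspace
    p: norm parameter, p >= 1
    """
    # The Euclidean distance from a point to span(V) is the norm of the
    # component of the point orthogonal to the subspace.
    residuals = A - (A @ V) @ V.T
    dists = np.linalg.norm(residuals, axis=1)
    # Objective: l_p-norm of the vector of distances.
    return np.sum(dists ** p) ** (1.0 / p)

# Sanity check: for p = 2 the optimal subspace is spanned by the top-k right
# singular vectors of A (classical PCA / low-rank approximation).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
_, _, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt[:3].T  # k = 3
print(subspace_cost(A, V, p=2))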

Subject Classification

ACM Subject Classification
  • Theory of computation → Approximation algorithms analysis
Keywords
  • Approximation Algorithms
  • Subspace Approximation

Metrics

  • Access Statistics
  • Total Accesses (updated on a weekly basis)
    0
    PDF Downloads

References

  1. Kenneth L. Clarkson. Subgradient and sampling algorithms for 𝓁₁ regression. In Proc. 16th Symposium on Discrete Algorithms, pages 257-266, 2005.
  2. Johanne Cohen, Christoph Dürr, and Nguyen Kim Thang. Smooth inequalities and equilibrium inefficiency in scheduling games. In Internet and Network Economics, pages 350-363, 2012.
  3. Anirban Dasgupta, Petros Drineas, Boulos Harb, Ravi Kumar, and Michael W. Mahoney. Sampling algorithms and coresets for 𝓁_p regression. SIAM Journal on Computing, 38(5):2060-2078, 2009.
  4. Amit Deshpande, Madhur Tulsiani, and Nisheeth K. Vishnoi. Algorithms and hardness for subspace approximation. In Proc. 22nd Symposium on Discrete Algorithms, pages 482-496, 2011.
  5. Petros Drineas, Michael W. Mahoney, and S. Muthukrishnan. Sampling algorithms for 𝓁₂ regression and applications. In Proc. 17th Symposium on Discrete Algorithms, pages 1127-1136, 2006.
  6. Gene H. Golub and Charles F. Van Loan. Matrix Computations. Johns Hopkins University Press, 2012.
  7. Venkatesan Guruswami, Prasad Raghavendra, Rishi Saket, and Yi Wu. Bypassing UGC from some optimal geometric inapproximability results. ACM Transactions on Algorithms, 12(1):6, 2016.
  8. Guy Kindler, Assaf Naor, and Gideon Schechtman. The UGC hardness threshold of the L_p Grothendieck problem. Mathematics of Operations Research, 35(2):267-283, 2010.
  9. Arkadi Nemirovski, Cornelis Roos, and Tamás Terlaky. On maximization of quadratic form over intersection of ellipsoids with common center. Mathematical Programming, 86(3):463-473, 1999.
  10. Yuri Nesterov. Global quadratic optimization via conic relaxation. Technical report, Center for Operations Research and Econometrics (CORE), Université Catholique de Louvain, 1998.
  11. Kasturi Varadarajan, Srinivasan Venkatesh, Yinyu Ye, and Jiawei Zhang. Approximating the radii of point sets. SIAM Journal on Computing, 36(6):1764-1776, 2007.