Compressed Sensing with Adversarial Sparse Noise via L1 Regression

Authors: Sushrut Karmalkar, Eric Price




File: OASIcs.SOSA.2019.19.pdf (0.5 MB, 19 pages)


Cite As

Sushrut Karmalkar and Eric Price. Compressed Sensing with Adversarial Sparse Noise via L1 Regression. In 2nd Symposium on Simplicity in Algorithms (SOSA 2019). Open Access Series in Informatics (OASIcs), Volume 69, pp. 19:1-19:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019) https://doi.org/10.4230/OASIcs.SOSA.2019.19

Abstract

We present a simple and effective algorithm for the problem of sparse robust linear regression. In this problem, one would like to estimate a sparse vector w^* ∈ R^n from linear measurements corrupted by sparse noise that can arbitrarily change an adversarially chosen η fraction of the measured responses y, as well as by bounded-norm noise on the responses.
For Gaussian measurements, we show that a simple algorithm based on L1 regression successfully estimates w^* for any η < η_0 ≈ 0.239, and that this threshold is tight for the algorithm. For k-sparse estimation the algorithm requires O(k log(n/k)) measurements, which is within constant factors of the number needed without any sparse noise.
We show three properties: the ability to estimate sparse, as well as dense, w^*; tolerance of a large constant fraction of outliers; and tolerance of adversarial rather than distributional (e.g., Gaussian) dense noise. To the best of our knowledge, no previous polynomial-time algorithm was known to achieve more than two of these.
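For illustration, the sketch below (our own, not code from the paper) sets up the problem and runs L1-constrained L1 regression in Python with numpy and cvxpy. The specific parameter values and the assumption that ||w^*||_1 is known exactly are simplifications for the example.

    # Minimal sketch of sparse robust linear regression via L1 regression
    # (illustrative; not the authors' implementation). Assumptions beyond
    # the abstract: cvxpy as the solver, and a known bound R on ||w^*||_1.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n, m, k, eta = 200, 100, 5, 0.1   # dimension, measurements, sparsity, corruption fraction

    # k-sparse ground truth w^* and Gaussian measurement matrix A
    w_star = np.zeros(n)
    w_star[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    A = rng.standard_normal((m, n))
    y = A @ w_star

    # adversarial sparse noise: an eta fraction of responses changed arbitrarily
    bad = rng.choice(m, int(eta * m), replace=False)
    y[bad] += 100.0 * rng.standard_normal(len(bad))

    # L1 regression with an L1-norm constraint on the estimate
    R = np.linalg.norm(w_star, 1)     # assumed known here for simplicity
    w = cp.Variable(n)
    cp.Problem(cp.Minimize(cp.norm1(A @ w - y)), [cp.norm1(w) <= R]).solve()

    print("recovery error:", np.linalg.norm(w.value - w_star))

Since both the objective and the constraint are polyhedral, the program is a linear program. Consistent with the abstract, recovery in this setup should degrade only once the corruption fraction η approaches the η_0 ≈ 0.239 threshold.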

Keywords
  • Robust Regression
  • Compressed Sensing

