We present a simple and effective algorithm for the problem of sparse robust linear regression. In this problem, one would like to estimate a sparse vector w^* in R^n from linear measurements corrupted by sparse noise that can arbitrarily change an adversarially chosen eta fraction of the measured responses y, as well as by bounded-norm noise on the responses. For Gaussian measurements, we show that a simple algorithm based on L1 regression can successfully estimate w^* for any eta < eta_0 ≈ 0.239, and that this threshold is tight for the algorithm. The number of measurements required by the algorithm is O(k log(n/k)) for k-sparse estimation, which is within constant factors of the number needed without any sparse noise. We show three properties: the ability to estimate sparse, as well as dense, w^*; tolerance of a large constant fraction of outliers; and tolerance of adversarial rather than distributional (e.g., Gaussian) dense noise. To the best of our knowledge, no previous polynomial-time algorithm was known to achieve more than two of the three.
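The core estimator the abstract refers to is L1 (least absolute deviations) regression, optionally with an L1 penalty on w to encourage sparsity. A minimal sketch of this idea, cast as a linear program via `scipy.optimize.linprog`, is shown below. This is our own illustrative implementation, not the authors' code; the function name, the penalty weight `lam`, and the problem sizes in the demo are arbitrary choices for demonstration.

```python
import numpy as np
from scipy.optimize import linprog

def l1_regression(X, y, lam=0.0):
    """Solve min_w ||y - X w||_1 + lam * ||w||_1 as a linear program.

    LP variables z = [w (n), t (m), u (n)], where t bounds the
    absolute residuals |y - Xw| and u bounds the magnitudes |w|.
    """
    m, n = X.shape
    c = np.concatenate([np.zeros(n), np.ones(m), lam * np.ones(n)])
    I_m, I_n = np.eye(m), np.eye(n)
    Z_mn, Z_nm = np.zeros((m, n)), np.zeros((n, m))
    A_ub = np.block([
        [ X,   -I_m,  Z_mn],   #  (Xw - y) <= t
        [-X,   -I_m,  Z_mn],   # -(Xw - y) <= t
        [ I_n,  Z_nm, -I_n],   #   w <= u
        [-I_n,  Z_nm, -I_n],   #  -w <= u
    ])
    b_ub = np.concatenate([y, -y, np.zeros(n), np.zeros(n)])
    bounds = [(None, None)] * n + [(0, None)] * (m + n)  # w free; t, u >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n]

# Demo: recover a sparse w* despite adversarially corrupted responses.
rng = np.random.default_rng(0)
n, m, k = 30, 200, 3
w_star = np.zeros(n)
w_star[:k] = [3.0, -2.0, 1.5]
X = rng.standard_normal((m, n))       # Gaussian measurements
y = X @ w_star
y[:20] += 50.0                        # corrupt 10% of responses arbitrarily
w_hat = l1_regression(X, y, lam=0.01)
```

Because the corrupted fraction (10%) is well below the eta_0 ≈ 0.239 threshold, the L1 objective ignores the outliers and `w_hat` lands very close to `w_star`; ordinary least squares on the same data would be badly skewed by the corrupted rows.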
@InProceedings{karmalkar_et_al:OASIcs.SOSA.2019.19,
  author =    {Karmalkar, Sushrut and Price, Eric},
  title =     {{Compressed Sensing with Adversarial Sparse Noise via L1 Regression}},
  booktitle = {2nd Symposium on Simplicity in Algorithms (SOSA 2019)},
  pages =     {19:1--19:19},
  series =    {Open Access Series in Informatics (OASIcs)},
  ISBN =      {978-3-95977-099-6},
  ISSN =      {2190-6807},
  year =      {2019},
  volume =    {69},
  editor =    {Fineman, Jeremy T. and Mitzenmacher, Michael},
  publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =   {Dagstuhl, Germany},
  URL =       {https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SOSA.2019.19},
  URN =       {urn:nbn:de:0030-drops-100455},
  doi =       {10.4230/OASIcs.SOSA.2019.19},
  annote =    {Keywords: Robust Regression, Compressed Sensing}
}