Compressed Sensing with Adversarial Sparse Noise via L1 Regression
Abstract
We present a simple and effective algorithm for the problem of \emph{sparse robust linear regression}. In this problem, one would like to estimate a sparse vector $w^* \in \mathbb{R}^n$ from linear measurements corrupted by sparse noise that can arbitrarily change an adversarially chosen $\eta$ fraction of the measured responses $y$, as well as introduce bounded-norm noise to the responses. For Gaussian measurements, we show that a simple algorithm based on L1 regression can successfully estimate $w^*$ for any $\eta < \eta_0 \approx 0.239$, and that this threshold is tight for the algorithm. The number of measurements required by the algorithm is $O(k \log \frac{n}{k})$ for $k$-sparse estimation, which is within constant factors of the number needed without any sparse noise. We establish three properties: the ability to estimate sparse, as well as dense, $w^*$; tolerance of a large constant fraction of outliers; and tolerance of adversarial rather than distributional (e.g., Gaussian) dense noise. To the best of our knowledge, no previous result achieved more than two of these.
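To make the approach concrete, here is a minimal sketch of an L1-regression estimator of the kind the abstract describes: it minimizes $\|Xw - y\|_1 + \lambda \|w\|_1$ by casting the problem as a linear program. This is an illustrative reconstruction, not the paper's exact formulation; the function name, the regularization parameter $\lambda$, and the corruption pattern in the demo are assumptions for the example.

```python
import numpy as np
from scipy.optimize import linprog

def l1_sparse_regression(X, y, lam=0.05):
    """Estimate w by minimizing ||Xw - y||_1 + lam * ||w||_1 via an LP.

    Variables are stacked as [w (n), t (m), s (n)], where t bounds the
    absolute residuals and s bounds the absolute coefficients.
    """
    m, n = X.shape
    c = np.concatenate([np.zeros(n), np.ones(m), lam * np.ones(n)])
    A_ub = np.block([
        [ X,          -np.eye(m),       np.zeros((m, n))],  #  Xw - y <= t
        [-X,          -np.eye(m),       np.zeros((m, n))],  # -(Xw - y) <= t
        [ np.eye(n),   np.zeros((n, m)), -np.eye(n)],       #  w <= s
        [-np.eye(n),   np.zeros((n, m)), -np.eye(n)],       # -w <= s
    ])
    b_ub = np.concatenate([y, -y, np.zeros(2 * n)])
    bounds = [(None, None)] * n + [(0, None)] * (m + n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n]

# Demo: k-sparse signal, Gaussian measurements, ~10% adversarial outliers
# (well below the eta_0 ~ 0.239 threshold from the abstract).
rng = np.random.default_rng(0)
n, m, k = 20, 60, 2
w_star = np.zeros(n)
w_star[:k] = [3.0, -2.0]
X = rng.standard_normal((m, n))
y = X @ w_star
y[:6] += 50.0  # arbitrarily corrupt an eta = 0.1 fraction of responses
w_hat = l1_sparse_regression(X, y)
print("recovery error:", np.linalg.norm(w_hat - w_star))
```

Because the L1 residual loss grows only linearly in each corrupted response, a small fraction of gross outliers cannot pull the minimizer far from $w^*$, while the $\lambda \|w\|_1$ term promotes sparsity of the estimate.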
Publication: arXiv e-prints
Pub Date: September 2018
arXiv: arXiv:1809.08055
Bibcode: 2018arXiv180908055K
Keywords: Computer Science - Data Structures and Algorithms; Computer Science - Machine Learning