Sampling Rates for $\ell^1$-Synthesis
Abstract
This work investigates the problem of signal recovery from undersampled noisy sub-Gaussian measurements under the assumption of a synthesis-based sparsity model. Solving the $\ell^1$-synthesis basis pursuit problem allows for simultaneous estimation of a coefficient representation as well as the sought-for signal. However, due to linear dependencies among the atoms of a redundant dictionary, it might be impossible to identify a specific representation vector, even though the actual signal is still successfully recovered. The present manuscript studies both estimation problems from a nonuniform, signal-dependent perspective. By utilizing recent results on the convex geometry of linear inverse problems, the sampling rates describing the phase transitions of each formulation are identified. In both cases, they are given by the conic Gaussian mean width of an $\ell^1$-descent cone that is linearly transformed by the dictionary. In general, this expression does not allow a simple calculation by following the polarity-based approach commonly found in the literature. Hence, two upper bounds involving the sparsity of coefficient representations are provided: the first is based on a local condition number and the second on a geometric analysis that makes use of the thinness of high-dimensional polyhedral cones with not too many generators. It is furthermore revealed that the two recovery problems can differ dramatically with respect to robustness to measurement noise, a fact that seems to have gone unnoticed in most of the related literature. All insights are carefully underpinned by numerical simulations.
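As a concrete illustration of the setting described above, the following minimal sketch (not taken from the paper; all matrix dimensions, the random dictionary $D$, and the sparsity level are illustrative assumptions) solves the noiseless $\ell^1$-synthesis basis pursuit $\min_z \|z\|_1$ subject to $ADz = y$ by recasting it as a linear program via the standard split $z = u - v$ with $u, v \ge 0$:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d, m, s = 20, 30, 15, 2  # signal dim, dictionary size, measurements, sparsity

# Redundant random dictionary and sub-Gaussian (Gaussian) measurement matrix
D = rng.standard_normal((n, d)) / np.sqrt(n)
A = rng.standard_normal((m, n)) / np.sqrt(m)

# s-sparse coefficient vector, synthesized signal, and noiseless measurements
z0 = np.zeros(d)
z0[rng.choice(d, size=s, replace=False)] = rng.standard_normal(s)
x0 = D @ z0
y = A @ x0

# Basis pursuit  min ||z||_1  s.t.  (A D) z = y  as an LP:
# variables [u; v] >= 0 with z = u - v, objective sum(u) + sum(v)
M = A @ D
c = np.ones(2 * d)
A_eq = np.hstack([M, -M])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * d), method="highs")

z_hat = res.x[:d] - res.x[d:]   # recovered coefficient representation
x_hat = D @ z_hat               # recovered signal
```

Note that even when `x_hat` matches `x0`, the recovered coefficients `z_hat` need not equal `z0`: with a redundant dictionary, several representations can synthesize the same signal, which is precisely the distinction between the two estimation problems studied in the abstract.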
Publication: arXiv e-prints
Pub Date: April 2020
arXiv: arXiv:2004.07175
Bibcode: 2020arXiv200407175M
Keywords: Computer Science - Information Theory; Mathematics - Numerical Analysis