
Efficient Tuning-Free $\ell_1$-Regression of Nonnegative Compressible Signals (2003.13092v2)

Published 29 Mar 2020 in cs.IT, math.IT, math.OC, and q-bio.QM

Abstract: In compressed sensing the goal is to recover a signal from as few noisy, linear measurements as possible. The general assumption is that the signal has only a few non-zero entries. The recovery can be performed by multiple different decoders, but most of them rely on some tuning. Given an estimate for the noise level, a common convex approach to recover the signal is basis pursuit denoising. If the measurement matrix has the robust null space property with respect to the $\ell_2$-norm, basis pursuit denoising obeys stable and robust recovery guarantees. In the case of unknown noise levels, nonnegative least squares recovers non-negative signals if the measurement matrix fulfills an additional property (sometimes called the $M^+$-criterion). However, if the measurement matrix is the biadjacency matrix of a random left regular bipartite graph, it obeys, with high probability, the null space property with respect to the $\ell_1$-norm with optimal parameters. Therefore, we discuss non-negative least absolute deviation (NNLAD). For these measurement matrices, we prove a uniform, stable and robust recovery guarantee without the need for tuning. Such guarantees are important, since binary expander matrices are sparse and thus allow for fast sketching and recovery. We further present a method to solve the NNLAD numerically and show that it is comparable to state-of-the-art methods. Lastly, we explain how the NNLAD can be used for viral detection in the recent COVID-19 crisis.
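The NNLAD decoder discussed in the abstract solves $\min_x \|Ax - y\|_1$ subject to $x \ge 0$. As a sketch of the idea (not the authors' solver), this can be reformulated as a linear program by introducing a slack vector $t$ with $-t \le Ax - y \le t$ and minimizing $\sum_i t_i$; the example below uses SciPy's generic LP solver and a dense random matrix for illustration:

```python
# Illustrative sketch: NNLAD as a linear program.
# Assumption: this LP reformulation and SciPy's `linprog` stand in for the
# numerical method of the paper, which is not reproduced here.
import numpy as np
from scipy.optimize import linprog

def nnlad(A, y):
    """Solve min_x ||A x - y||_1 subject to x >= 0 via an LP.

    Variables are (x, t) with t a slack vector bounding |A x - y|
    componentwise; the objective is sum(t).
    """
    m, n = A.shape
    # Objective: 0 * x + 1 * t
    c = np.concatenate([np.zeros(n), np.ones(m)])
    # Encode  A x - y <= t  and  -(A x - y) <= t  as A_ub @ (x, t) <= b_ub.
    A_ub = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
    b_ub = np.concatenate([y, -y])
    # linprog's default variable bounds (0, None) enforce x >= 0 and t >= 0.
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
    return res.x[:n]

# Tiny demo: recover a sparse non-negative signal from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.random((20, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, 0.8]
y = A @ x_true
x_hat = nnlad(A, y)
print("max recovery error:", np.max(np.abs(x_hat - x_true)))
```

Note that no noise-level estimate is passed to `nnlad` — this tuning-freeness is exactly the property the paper emphasizes, in contrast to basis pursuit denoising, which needs a bound on the noise.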

Citations (2)
