
Bilinear Compressed Sensing under known Signs via Convex Programming (1906.11636v2)

Published 25 Jun 2019 in math.OC, cs.IT, and math.IT

Abstract: We consider the bilinear inverse problem of recovering two vectors, $\boldsymbol{x}\in\mathbb{R}^L$ and $\boldsymbol{w}\in\mathbb{R}^L$, from their entrywise product. We consider the case where $\boldsymbol{x}$ and $\boldsymbol{w}$ have known signs and are sparse with respect to known dictionaries of size $K$ and $N$, respectively. Here, $K$ and $N$ may be larger than, smaller than, or equal to $L$. We introduce $\ell_1$-BranchHull, a convex program posed in the natural parameter space that requires neither an approximate solution nor an initialization in order to be stated or solved. Under the assumptions that $\boldsymbol{x}$ and $\boldsymbol{w}$ satisfy a comparable-effective-sparsity condition and are $S_1$- and $S_2$-sparse with respect to a random dictionary, we present a recovery guarantee in the noisy case. We show that $\ell_1$-BranchHull is robust to small dense noise with high probability if the number of measurements satisfies $L \geq \Omega\left((S_1+S_2)\log^2(K+N)\right)$. Numerical experiments show that the scaling constant in the theorem is not too large. We also introduce variants of $\ell_1$-BranchHull for tolerating noise and outliers, and for recovering piecewise constant signals. We provide an ADMM implementation of these variants and show that they can extract piecewise constant behavior from real images.
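To make the setting concrete, the following is a minimal sketch of a BranchHull-style convex program in cvxpy. It is an assumed form reconstructed from the abstract, not the paper's verbatim formulation: each measurement $y_l = w_l x_l$ with known factor signs confines the pair of signed factors to the region above one branch of the hyperbola $uv = |y_l|$, which is a convex (rotated second-order cone) set, and an $\ell_1$ objective promotes sparse coefficients. All names (`B`, `C`, `h`, `m`, `t`) are illustrative.

```python
# Hypothetical sketch of an l1-BranchHull-style convex program, assumed from
# the abstract: w = B h, x = C m, measurements y = w * x, with the signs of
# one factor (t = sign(x)) known. Not the paper's verbatim formulation.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
L, K, N, S1, S2 = 200, 50, 50, 3, 3

# Random dictionaries and sparse ground-truth coefficients.
B = rng.standard_normal((L, K))
C = rng.standard_normal((L, N))
h_true = np.zeros(K); h_true[rng.choice(K, S1, replace=False)] = rng.standard_normal(S1)
m_true = np.zeros(N); m_true[rng.choice(N, S2, replace=False)] = rng.standard_normal(S2)

w, x = B @ h_true, C @ m_true
y = w * x                    # entrywise product measurements
t = np.sign(x)               # known signs of one factor
s = np.sign(y) * t           # implied signs of the other factor

h = cp.Variable(K)
m = cp.Variable(N)
u = cp.multiply(s, B @ h)    # signed factors, constrained nonnegative below
v = cp.multiply(t, C @ m)

# One convex constraint per measurement: stay on or above the known
# branch of the hyperbola u_l * v_l = |y_l|.
cons = [u >= 0, v >= 0]
for l in range(L):
    cons.append(cp.geo_mean(cp.hstack([u[l], v[l]])) >= np.sqrt(abs(y[l])))

prob = cp.Problem(cp.Minimize(cp.norm1(h) + cp.norm1(m)), cons)
prob.solve()
```

Note that any bilinear problem of this kind has an inherent scaling ambiguity: $(c\boldsymbol{w}, \boldsymbol{x}/c)$ produces the same entrywise product for any $c>0$, so a recovered solution should be compared to the ground truth only after rescaling.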

Citations (4)
