
Near optimal compressed sensing without priors: Parametric SURE Approximate Message Passing (1409.0440v1)

Published 12 Aug 2014 in cs.IT and math.IT

Abstract: Both theoretical analysis and empirical evidence confirm that the approximate message passing (AMP) algorithm can be interpreted as recursively solving a signal denoising problem: at each AMP iteration, one observes the original signal perturbed by Gaussian noise. Retrieving the signal amounts to successive noise cancellation until the noise variance decreases to a satisfactory level. In this paper we incorporate a Stein's unbiased risk estimate (SURE) based parametric denoiser into the AMP framework and propose the novel parametric SURE-AMP algorithm. At each parametric SURE-AMP iteration, the denoiser is adaptively optimized within the parametric class by minimizing SURE, which depends purely on the noisy observation. In this manner, parametric SURE-AMP is guaranteed to attain the best-in-class recovery and convergence rate. If the parametric family includes the family of minimum mean squared error (MMSE) estimators, we are able to achieve Bayesian optimal AMP performance without knowing the signal prior. In the paper, we adopt a linear parameterization of the SURE-based denoiser and propose three different kernel families as the base functions. Numerical simulations with Bernoulli-Gaussian, $k$-dense and Student's-t signals demonstrate that parametric SURE-AMP not only achieves state-of-the-art recovery but also runs more than 20 times faster than the EM-GM-GAMP algorithm.
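
To make the idea concrete, below is a minimal sketch of the recursion the abstract describes: an AMP loop whose denoiser is a linear combination of kernel base functions, with the combination weights chosen at every iteration by minimizing SURE. The kernel choices, variable names, and step details here are illustrative assumptions for exposition, not the paper's exact kernel families or implementation.

```python
# Sketch of parametric SURE-AMP: AMP iterations with a denoiser f(r) = Phi(r) @ c,
# where the coefficients c are chosen each iteration by minimizing SURE.
# Kernel family and parameter choices below are assumptions for illustration.
import numpy as np

def kernels(r, tau):
    """Base functions phi_j(r) and their derivatives w.r.t. r.
    Here: identity plus two Gaussian-weighted kernels (an assumed family)."""
    g1 = np.exp(-r**2 / (2 * tau))
    g2 = np.exp(-r**2 / (8 * tau))
    Phi = np.stack([r, r * g1, r * g2], axis=1)
    dPhi = np.stack([
        np.ones_like(r),
        g1 * (1 - r**2 / tau),
        g2 * (1 - r**2 / (4 * tau)),
    ], axis=1)
    return Phi, dPhi

def sure_denoise(r, tau):
    """Choose c minimizing SURE for f(r) = Phi(r) @ c with r = x + N(0, tau).
    SURE is quadratic in c, so the minimizer solves a small linear system:
        Phi^T Phi c = Phi^T r - tau * sum_i dPhi_i."""
    Phi, dPhi = kernels(r, tau)
    A = Phi.T @ Phi
    b = Phi.T @ r - tau * dPhi.sum(axis=0)
    c = np.linalg.solve(A, b)
    x_hat = Phi @ c
    avg_div = (dPhi @ c).mean()   # average divergence, reused in the Onsager term
    return x_hat, avg_div

def parametric_sure_amp(y, A, iters=30):
    """Bare-bones AMP recursion using the SURE-tuned denoiser above."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(iters):
        tau = np.mean(z**2)              # effective noise variance estimate
        r = x + A.T @ z                  # pseudo-data: signal plus ~Gaussian noise
        x, avg_div = sure_denoise(r, tau)
        z = y - A @ x + (n / m) * avg_div * z   # residual with Onsager correction
    return x
```

Because the denoiser is linear in its parameters, the per-iteration SURE minimization reduces to a tiny least-squares solve, which is why the approach avoids the EM-style prior fitting of EM-GM-GAMP while still adapting to the unknown signal distribution.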

Citations (62)
