
Bayes-Optimal Convolutional AMP (2003.12245v4)

Published 27 Mar 2020 in cs.IT and math.IT

Abstract: This paper proposes Bayes-optimal convolutional approximate message-passing (CAMP) for signal recovery in compressed sensing. CAMP uses the same low-complexity matched filter (MF) for interference suppression as approximate message-passing (AMP). To improve the convergence property of AMP for ill-conditioned sensing matrices, the so-called Onsager correction term in AMP is replaced by a convolution of all preceding messages. The tap coefficients in the convolution are determined so as to realize asymptotic Gaussianity of estimation errors via state evolution (SE) under the assumption of orthogonally invariant sensing matrices. An SE equation is derived to optimize the sequence of denoisers in CAMP. The optimized CAMP is proved to be Bayes-optimal for all orthogonally invariant sensing matrices if the SE equation converges to a fixed-point and if the fixed-point is unique. For sensing matrices with low-to-moderate condition numbers, CAMP can achieve the same performance as high-complexity orthogonal/vector AMP that requires the linear minimum mean-square error (LMMSE) filter instead of the MF.
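To make the structural difference from standard AMP concrete, here is a minimal Python sketch of a CAMP-style iteration: the matched filter A.T @ z is kept, but the single Onsager term is replaced by a convolution over all preceding residual messages. This is only an illustration of the iteration structure described in the abstract; the soft-thresholding denoiser, the zero placeholder tap coefficients, and the synthetic demo are assumptions for readability, not the paper's state-evolution-optimized choices.

```python
import numpy as np


def soft_threshold(v, tau):
    # Element-wise soft-thresholding denoiser; one common choice for sparse
    # signals. The paper instead optimizes the denoiser sequence via state
    # evolution, which this sketch does not reproduce.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def camp_sketch(A, y, num_iters=20, tap_fn=None, threshold=0.1):
    # Structural sketch of a CAMP-style iteration:
    #   AMP residual:  z_t = y - A x_t + b_t * z_{t-1}              (single Onsager term)
    #   CAMP residual: z_t = y - A x_t + sum_k theta_{t,k} z_{t-1-k} (convolution over
    #                                                                 all preceding messages)
    # followed by the low-complexity matched filter A.T @ z_t and a denoiser.
    # tap_fn(t) should return the taps (theta_{t,0}, ..., theta_{t,t-1}); in the
    # paper they are derived from state evolution for orthogonally invariant
    # sensing matrices. The zero placeholder below is NOT the SE-optimal choice
    # and simply drops the correction term.
    M, N = A.shape
    x = np.zeros(N)
    residuals = []  # history of residual messages z_0, ..., z_{t-1}

    if tap_fn is None:
        tap_fn = lambda t: np.zeros(t)  # hypothetical placeholder taps

    for t in range(num_iters):
        z = y - A @ x
        taps = tap_fn(t)
        # Apply the taps to the preceding residuals (most recent first).
        for k, z_past in enumerate(reversed(residuals)):
            z = z + taps[k] * z_past
        residuals.append(z)

        r = x + A.T @ z  # matched-filter interference suppression
        x = soft_threshold(r, threshold)

    return x


if __name__ == "__main__":
    # Tiny synthetic demo (illustrative only, not from the paper).
    rng = np.random.default_rng(0)
    N, M, K = 200, 100, 10
    A = rng.standard_normal((M, N)) / np.sqrt(M)
    x_true = np.zeros(N)
    x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    y = A @ x_true + 0.01 * rng.standard_normal(M)
    x_hat = camp_sketch(A, y, num_iters=30, threshold=0.05)
    print("relative MSE:", np.sum((x_hat - x_true) ** 2) / np.sum(x_true ** 2))
```

With the placeholder zero taps this degenerates to a correction-free matched-filter iteration; the paper's contribution is precisely the SE-based choice of tap coefficients and denoisers that restores asymptotic Gaussianity and Bayes-optimality for orthogonally invariant sensing matrices.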

Citations (34)
