
On the Convergence of Orthogonal/Vector AMP: Long-Memory Message-Passing Strategy (2111.05522v2)

Published 10 Nov 2021 in cs.IT and math.IT

Abstract: Orthogonal/vector approximate message-passing (AMP) is a powerful message-passing (MP) algorithm for signal reconstruction in compressed sensing. This paper proves the convergence of Bayes-optimal orthogonal/vector AMP in the large system limit. The proof strategy is based on a novel long-memory (LM) MP approach: the first step is the construction of an LM-MP algorithm that is guaranteed to converge systematically. The second step is a large-system analysis of LM-MP via an existing framework of state evolution. The third step is a proof of the convergence of the state evolution recursions for Bayes-optimal LM-MP via a new statistical interpretation of existing LM damping. The last step is an exact reduction of the state evolution recursions for Bayes-optimal LM-MP to those for Bayes-optimal orthogonal/vector AMP. The convergence of the state evolution recursions for Bayes-optimal LM-MP thus implies the convergence of those for Bayes-optimal orthogonal/vector AMP. Numerical simulations are presented to verify the state evolution results for damped orthogonal/vector AMP and to illustrate a negative aspect of LM-MP in finite-sized systems.
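
To make the setting concrete, below is a minimal sketch of a conventionally damped OAMP/VAMP recursion for compressed sensing, assuming an i.i.d. Gaussian sensing matrix and a soft-thresholding denoiser in place of the Bayes-optimal denoiser the paper analyzes. The function names (`damped_vamp`, `soft_threshold`) and parameters (`lam`, `damp`) are illustrative only, and the damping shown is the usual memory-one damping of the extrinsic messages, not the paper's long-memory construction.

```python
# Illustrative sketch only: damped OAMP/VAMP with a soft-thresholding denoiser.
# This is NOT the paper's Bayes-optimal or long-memory algorithm.
import numpy as np

def soft_threshold(r, tau):
    """Soft-thresholding denoiser and its average derivative (divergence)."""
    x = np.sign(r) * np.maximum(np.abs(r) - tau, 0.0)
    div = np.mean(np.abs(r) > tau)        # fraction of surviving entries
    return x, div

def damped_vamp(y, A, sigma2, lam=0.1, n_iter=30, damp=0.8):
    M, N = A.shape
    gamma_w = 1.0 / sigma2                # noise precision
    r1, gamma1 = np.zeros(N), 1e-3        # extrinsic mean/precision for the denoiser
    AtA, Aty = A.T @ A, A.T @ y
    x1 = np.zeros(N)
    for _ in range(n_iter):
        # Denoising module: soft-threshold stands in for the Bayes-optimal denoiser.
        x1, alpha1 = soft_threshold(r1, lam / np.sqrt(gamma1))
        alpha1 = min(max(alpha1, 1e-6), 1 - 1e-6)
        eta1 = gamma1 / alpha1
        gamma2 = eta1 - gamma1
        r2 = (eta1 * x1 - gamma1 * r1) / gamma2
        # LMMSE module: linear estimation from y and the extrinsic message r2.
        C = np.linalg.inv(gamma_w * AtA + gamma2 * np.eye(N))
        x2 = C @ (gamma_w * Aty + gamma2 * r2)
        alpha2 = gamma2 * np.trace(C) / N
        eta2 = gamma2 / alpha2
        gamma1_new = eta2 - gamma2
        r1_new = (eta2 * x2 - gamma2 * r2) / gamma1_new
        # Conventional (memory-one) damping on the message returned to the denoiser.
        r1 = damp * r1_new + (1 - damp) * r1
        gamma1 = damp * gamma1_new + (1 - damp) * gamma1
    return x1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, M, k = 400, 200, 20
    x0 = np.zeros(N)
    x0[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
    A = rng.standard_normal((M, N)) / np.sqrt(M)
    y = A @ x0 + 0.01 * rng.standard_normal(M)
    x_hat = damped_vamp(y, A, sigma2=1e-4, lam=0.05)
    print("NMSE:", np.sum((x_hat - x0) ** 2) / np.sum(x0 ** 2))
```

Roughly speaking, the long-memory strategy studied in the paper replaces the memory-one damping in the last two lines of the loop with damping over all past messages, and the convergence proof proceeds through the state evolution of that LM recursion.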

Citations (22)
