
A Simple Derivation of AMP and its State Evolution via First-Order Cancellation (1907.04235v3)

Published 9 Jul 2019 in cs.IT and math.IT

Abstract: We consider the linear regression problem, where the goal is to recover the vector $\boldsymbol{x}\in\mathbb{R}^n$ from measurements $\boldsymbol{y}=\boldsymbol{A}\boldsymbol{x}+\boldsymbol{w}\in\mathbb{R}^m$ under known matrix $\boldsymbol{A}$ and unknown noise $\boldsymbol{w}$. For large i.i.d. sub-Gaussian $\boldsymbol{A}$, the approximate message passing (AMP) algorithm is precisely analyzable through a state-evolution (SE) formalism, which furthermore shows that AMP is Bayes optimal in certain regimes. The rigorous SE proof, however, is long and complicated. And, although the AMP algorithm can be derived as an approximation of loopy belief propagation (LBP), this viewpoint provides little insight into why large i.i.d. $\boldsymbol{A}$ matrices are important for AMP, or why AMP has a state evolution. In this work, we provide a heuristic derivation of AMP and its state evolution, based on the idea of "first-order cancellation," that provides insights missing from the LBP derivation while being much shorter than the rigorous SE proof.
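To make the setting concrete, the following is a minimal sketch of the AMP iteration for the linear model $\boldsymbol{y}=\boldsymbol{A}\boldsymbol{x}+\boldsymbol{w}$, using a soft-thresholding denoiser and the standard Onsager correction term. This is an illustrative implementation, not the paper's code: the threshold rule (a multiple `alpha` of the empirical residual level) and the parameter `alpha` itself are common heuristic choices assumed here, and the function names are our own.

```python
import numpy as np

def soft_threshold(v, tau):
    # Soft-thresholding denoiser: eta(v; tau) = sign(v) * max(|v| - tau, 0)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp(y, A, n_iters=30, alpha=1.5):
    # Minimal AMP sketch for y = A x + w with i.i.d. sub-Gaussian A.
    # alpha scales the threshold relative to the estimated effective noise
    # level; this tuning rule is an assumption, not from the paper.
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iters):
        # Pseudo-data: x + A^T z behaves like the true x plus i.i.d.
        # Gaussian noise (the key consequence of state evolution).
        r = x + A.T @ z
        tau = alpha * np.sqrt(np.mean(z**2))  # empirical effective noise level
        x_new = soft_threshold(r, tau)
        # Onsager correction: (n/m) * z * <eta'(r)>, where <eta'(r)> is the
        # fraction of components above the threshold for soft thresholding.
        onsager = (n / m) * z * np.mean(np.abs(r) > tau)
        z = y - A @ x_new + onsager
        x = x_new
    return x
```

Dropping the `onsager` term turns this into plain iterative soft thresholding, which converges more slowly and whose errors are no longer well described by a state evolution; the correction is exactly what the paper's "first-order cancellation" argument is designed to explain.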

Citations (8)
