A Non-asymptotic Analysis of Generalized Approximate Message Passing Algorithms with Right Rotationally Invariant Designs (2302.00088v1)

Published 31 Jan 2023 in math.ST and stat.TH

Abstract: Approximate Message Passing (AMP) algorithms are a class of iterative procedures for computationally efficient estimation in high-dimensional inference and estimation tasks. Due to the presence of an 'Onsager' correction term in its iterates, for $N \times M$ design matrices $\mathbf{A}$ with i.i.d. Gaussian entries, the asymptotic distribution of the estimate at any iteration of the algorithm can be exactly characterized in the large system limit as $M/N \rightarrow \delta \in (0, \infty)$ via a scalar recursion referred to as state evolution. In this paper, we show that appropriate functionals of the iterates, in fact, concentrate around their limiting values predicted by these asymptotic distributions with rates exponentially fast in $N$ for a large class of AMP-style algorithms, including those that are used when high-dimensional generalized linear regression models are assumed to be the data-generating process, like the generalized AMP algorithm, or those that are used when the measurement matrix is assumed to be right rotationally invariant instead of i.i.d. Gaussian, like vector AMP and generalized vector AMP. In practice, these more general AMP algorithms have many applications, for example in communications or imaging, and this work provides the first study of finite sample behavior of such algorithms.
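
To make the 'Onsager' correction and state-evolution language concrete, below is a minimal sketch of the basic AMP recursion for a linear model $y = Ax + w$ with an i.i.d. Gaussian design and a soft-thresholding denoiser. It is only an illustration of the standard algorithm, not the paper's GAMP/VAMP/GVAMP variants or its concentration analysis; the function names (`soft_threshold`, `amp`), the threshold `tau`, the sampling-ratio convention `delta = n / p`, and all problem sizes are assumptions chosen for the example.

```python
# Minimal sketch of the standard AMP recursion for y = A x + w with an
# i.i.d. Gaussian n x p design A and a soft-thresholding denoiser.
# Illustrative only; names, sizes, and the delta convention are assumptions.
import numpy as np

def soft_threshold(v, tau):
    """Elementwise soft-thresholding denoiser: sign(v) * max(|v| - tau, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp(A, y, n_iter=30, tau=0.1):
    n, p = A.shape
    delta = n / p                    # sampling ratio (convention assumed here)
    x = np.zeros(p)                  # current signal estimate
    z = y.copy()                     # current residual
    for _ in range(n_iter):
        # Pseudo-data whose empirical distribution state evolution tracks
        # in the large-system limit.
        v = x + A.T @ z
        x_new = soft_threshold(v, tau)
        # Onsager correction: average derivative of the denoiser at v,
        # rescaled by the sampling ratio.
        onsager = np.mean(np.abs(v) > tau) / delta
        z = y - A @ x_new + onsager * z
        x = x_new
    return x

# Tiny synthetic usage example (all sizes illustrative).
rng = np.random.default_rng(0)
p, n, k = 400, 200, 20
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(n, p)) / np.sqrt(n)   # i.i.d. Gaussian design
y = A @ x_true + 0.01 * rng.normal(size=n)
x_hat = amp(A, y)
```

The `onsager * z` term is what distinguishes AMP from plain iterative thresholding: it debiases the residual so that the effective observation `v` behaves like the true signal plus Gaussian noise, which is exactly the structure that the state-evolution recursion, and the finite-sample concentration results described in the abstract, formalize.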

Citations (5)
