Onsager Correction in GOAMP
- Onsager correction in GOAMP is designed to enforce asymptotic Gaussianity and error decoupling, ensuring that each iteration’s error is asymptotically uncorrelated with previous iterates and statistically predictable.
- The framework incorporates explicit Onsager terms derived from normalized Jacobian traces, which cancel out inherent error dependencies during iterative updates.
- Empirical results demonstrate that GOAMP outperforms conventional AMP, achieving near-optimal MSE under challenging conditions such as ill-conditioned and structured sensing matrices.
The Onsager correction in Generalized Orthogonal Approximate Message Passing (GOAMP) constitutes a central component enabling stable, high-dimensional sparse inference from generalized linear measurements, particularly under challenging conditions such as sublinear sparsity and structured, ill-conditioned sensing matrices. GOAMP generalizes AMP and orthogonal AMP (OAMP) to cases where the number of nonzero components grows sublinearly with signal dimension, and achieves asymptotic error decoupling via explicit Onsager terms designed using state-evolution theory for orthogonally invariant matrices (Takeuchi, 3 Dec 2025).
1. Core Principle of Onsager Correction in GOAMP
The Onsager correction in GOAMP is constructed to enforce asymptotic Gaussianity and decorrelation of estimation errors across iterations. In AMP and related algorithms, the iteration involves passing estimates between denoiser and linear modules. Direct iteration often induces correlations between the current error and prior residuals, invalidating state-evolution predictions and causing algorithmic breakdown for non-i.i.d. Gaussian sensing matrices. The Onsager correction explicitly subtracts a weighted prior message, with the weight (the Onsager coefficient) given by the average divergence (Jacobian trace) of the denoiser or linear module’s output with respect to its input. This correction ensures that, in the large-system limit with orthogonally invariant sensing matrices, estimation errors become asymptotically white and Gaussian, making them analyzable and predictable via scalar recursion (Takeuchi, 3 Dec 2025).
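To make this mechanism concrete, here is a minimal NumPy sketch of the classical i.i.d.-Gaussian AMP baseline that GOAMP generalizes (the names `soft_threshold` and `amp` and the fixed threshold `theta` are illustrative assumptions, not the paper's notation). For the soft-threshold denoiser, the average divergence reduces to the fraction of surviving coefficients, which is exactly the Onsager coefficient multiplying the previous residual:

```python
import numpy as np

def soft_threshold(v, theta):
    """Componentwise soft-thresholding denoiser."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, theta, n_iter=30):
    """Classical AMP for sparse recovery with i.i.d. Gaussian A.

    The term b * z is the Onsager correction; dropping it yields plain
    iterative soft-thresholding, whose errors stay correlated across
    iterations and break the Gaussianity of the pseudo-data r.
    """
    M, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        r = x + A.T @ z                # pseudo-data: signal + effective AWGN
        x = soft_threshold(r, theta)
        b = np.count_nonzero(x) / M    # Onsager coefficient = divergence / M
        z = y - A @ x + b * z          # residual with Onsager correction
    return x
```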
2. Structure of GOAMP Iterations and Explicit Onsager Terms
GOAMP is organized into four iterative modules: outer and inner updates in the measurement and signal domains, respectively, each followed by an Onsager correction. Let t index iterations; the labels A and B distinguish the linear (LMMSE) stages from the nonlinear (scalar-denoiser) stages. One iteration proceeds as follows:
- Outer-A (measurement domain, linear iteration): an LMMSE update of the measurement-domain message, followed by an Onsager correction that subtracts the incoming message weighted by the module's normalized Jacobian trace.
- Inner-A (signal domain, prior iteration): the corresponding LMMSE update in the signal domain, again followed by its Onsager correction.
- Outer-B (measurement nonlinearity domain): a componentwise scalar denoiser matched to the measurement nonlinearity, followed by its Onsager correction.
- Inner-B (signal prior denoiser): a componentwise scalar denoiser matched to the signal prior, followed by its Onsager correction.
Each Onsager term is derived as the normalized divergence (Jacobian trace) of the corresponding module's output with respect to its input; subtracting it cancels first-order correlations and restores error "decoupling" (a generic sketch of this construction follows).
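Because the paper's explicit update formulas use its own notation, the following NumPy sketch shows only the shared construction behind every module's Onsager term: estimate the normalized Jacobian trace of a black-box module by a randomized (Hutchinson-style) probe, then subtract the input weighted by that coefficient. This is the standard divergence-free form from the OAMP/VAMP literature; the function names and the rescaling convention are assumptions, not GOAMP's exact per-module coefficients.

```python
import numpy as np

def mc_divergence(module, r, eps=1e-4, rng=None):
    """Hutchinson-style Monte Carlo estimate of the normalized Jacobian
    trace (1/N) * tr(d module / d r) of a black-box module at input r."""
    rng = np.random.default_rng() if rng is None else rng
    p = rng.choice([-1.0, 1.0], size=r.shape)           # Rademacher probe
    return p @ (module(r + eps * p) - module(r)) / (eps * r.size)

def onsager_corrected(module, r):
    """Divergence-free (Onsager-corrected) module output: subtract the
    input message weighted by the average divergence, then rescale so
    the corrected estimator stays unbiased for the signal."""
    alpha = mc_divergence(module, r)                    # Onsager coefficient
    return (module(r) - alpha * r) / (1.0 - alpha)
```

The subtraction makes the corrected output's error orthogonal, to first order, to the input error, which is precisely the property the state-evolution analysis of the next section relies on.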
3. State Evolution and Theoretical Guarantees
GOAMP’s Onsager correction is tightly connected to its rigorous state evolution (SE), which predicts the per-iteration mean square errors in the large-system limit, with the measurement and signal dimensions growing large and the number of nonzero components growing sublinearly with the signal dimension.
The SE recursion tracks the following quantities:
- Per-module estimation MSEs, which evolve through coupled scalar maps.
- Onsager coefficients, given as averages of the relevant Jacobian traces, i.e. the normalized expectations (over the module inputs) of the module output derivatives.
- In the Bayes-optimal setting with linear measurements and constant-magnitude nonzeros, there is a sharp threshold on the number of measurements for error-free recovery; the threshold is determined by the nonzero magnitude and by the empirical singular value distribution of the sensing matrix A (Takeuchi, 3 Dec 2025).
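For intuition about what such a scalar recursion looks like, here is a hedged sketch of the classical SE map for the i.i.d.-Gaussian AMP baseline with a soft-threshold denoiser and constant-magnitude nonzeros; GOAMP's actual SE maps are module-specific and additionally depend on the singular-value distribution of A, so the parameterization below (delta = M/N, rho = k/N, threshold theta) is illustrative only.

```python
import numpy as np

def state_evolution(delta, rho, mag=1.0, theta=1.5, sigma_w=0.0,
                    n_iter=50, n_mc=200_000, seed=0):
    """Scalar state-evolution recursion for soft-threshold AMP:

        tau_{t+1}^2 = sigma_w^2 + (1/delta) * E[(eta(X + tau_t*Z) - X)^2]

    with X = +/-mag w.p. rho/2 each, 0 otherwise, and Z ~ N(0, 1).
    Returns the trajectory of effective-noise variances tau_t^2.
    """
    rng = np.random.default_rng(seed)
    x = mag * rng.choice([-1.0, 0.0, 1.0],
                         p=[rho / 2, 1 - rho, rho / 2], size=n_mc)
    z = rng.standard_normal(n_mc)
    tau2 = np.mean(x ** 2) / delta + sigma_w ** 2   # x0 = 0 initialization
    history = [tau2]
    for _ in range(n_iter):
        tau = np.sqrt(tau2)
        r = x + tau * z                              # effective AWGN channel
        est = np.sign(r) * np.maximum(np.abs(r) - theta * tau, 0.0)
        tau2 = sigma_w ** 2 + np.mean((est - x) ** 2) / delta
        history.append(tau2)
    return history

# Noiseless example: the trajectory either contracts toward 0 (above the
# recovery threshold) or stalls at a positive fixed point (below it),
# depending on (delta, rho, theta).
print(state_evolution(delta=0.6, rho=0.2)[-1])
```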
A key result is that GOAMP achieves the same information-theoretic minimum measurement threshold for perfect reconstruction as standard AMP does with i.i.d. Gaussian A, while maintaining robust convergence for matrix ensembles and condition numbers far beyond that regime.
4. Asymptotic Error Decorrelating Mechanism
The Onsager correction enforces that each module’s output error is, in law, asymptotically independent of all previous module outputs, up to Gaussian fluctuations whose variance is accurately tracked by SE. This property is known as error decoupling. Explicitly, the subtraction of the Onsager term cancels leading-order dependencies in the residuals, resulting in the input to each denoiser being an additive white Gaussian noise (AWGN)-corrupted version of the true signal. This guarantees:
- Predictability and analyzability of algorithmic dynamics.
- Correctness of empirical MSE curves as predicted by the scalar SE recursion, even when A is non-i.i.d. or ill-conditioned.
- Performance equivalence with optimal Bayesian procedures at the MMSE threshold in large dimensions.
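In formula form, with notation assumed for illustration (the paper's symbols may differ), the decoupling property says that each denoiser's Onsager-corrected input behaves like the true signal observed through an effective AWGN channel whose variance SE tracks:

```latex
% Decoupling property (illustrative notation, not the paper's symbols):
r^{t} \;\overset{\mathrm{d}}{\approx}\; x + \tau_t z,
\qquad z \sim \mathcal{N}(0, I_N)\ \text{independent of } x,
% where \tau_t^2 is the scalar effective-noise variance tracked by SE.
```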
5. Empirical Performance and Stability
Numerical simulations demonstrate that GOAMP with Onsager correction:
- Maintains near-optimal MSE up to very large condition numbers of A (the reported experiments operate at SNR = 40 dB).
- Exhibits breakdown points, as matrix conditioning degrades, that SE predicts accurately.
- Outperforms conventional AMP, GLasso, and even algorithms specifically tailored for non-i.i.d. or quantized measurements, both in linear and 1-bit compressed sensing regimes.
- Guarantees, via the Onsager-adjusted SE maps, that the scalar error recursion admits (0,0) as its unique fixed point once the measurement count exceeds the recovery threshold, ensuring convergence to the true signal (Takeuchi, 3 Dec 2025); a numerical illustration follows.
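As a sanity check on the fixed-point claim, one can search numerically for the smallest measurement ratio at which the illustrative SE recursion from Section 3 contracts to zero; this reuses the `state_evolution()` sketch above and assumes recovery is monotone in delta.

```python
def se_converges(delta, rho, tol=1e-8):
    """True if the illustrative SE recursion (Section 3 sketch) reaches
    (near-)zero MSE, i.e. (0,0) is the attracting fixed point."""
    return state_evolution(delta, rho)[-1] < tol

def find_threshold(rho, lo=0.01, hi=1.0, iters=30):
    """Bisection for the smallest delta with error-free recovery."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if se_converges(mid, rho):
            hi = mid        # recovery succeeds: threshold at or below mid
        else:
            lo = mid        # recovery fails: threshold above mid
    return hi
```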
6. Comparison with AMP and Onsager-Corrected Neural Architectures
GOAMP’s Onsager correction generalizes the mechanism originally introduced for AMP with i.i.d. Gaussian A. In deep learning contexts, architectures such as LAMP unfold AMP iterations as neural network layers with learnable denoisers and Onsager coefficients, retaining the key principle of decoupling prediction errors via the Onsager reaction term (Borgerding et al., 2016); a minimal layer sketch follows. Empirical results confirm that these mechanisms significantly outperform variants lacking the Onsager correction in convergence speed, robustness, and error decoupling, especially for ill-conditioned or structured A.
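A minimal sketch of one unfolded layer in the LAMP style (the use of a per-layer matrix B and threshold theta as the learnable parameters follows Borgerding et al., 2016; the plain-NumPy forward pass and names below are illustrative, not the authors' implementation):

```python
import numpy as np

def lamp_layer(x, z, y, A, B, theta):
    """One unfolded LAMP-style layer: B and theta stand in for the
    layer's trained parameters, while the Onsager coefficient b is still
    computed analytically from the soft-threshold denoiser's divergence,
    preserving error decoupling across layers."""
    M = A.shape[0]
    r = x + B @ z                                  # learned linear back-projection
    x_new = np.sign(r) * np.maximum(np.abs(r) - theta, 0.0)  # soft threshold
    b = np.count_nonzero(x_new) / M                # Onsager coefficient (divergence / M)
    z_new = y - A @ x_new + b * z                  # Onsager-corrected residual
    return x_new, z_new
```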
| Method | Matrix Classes | Onsager Term Type | Error Decoupling |
|---|---|---|---|
| AMP | i.i.d. Gaussian | Scalar bₜ (normalized nonzero count) | Asymptotic |
| GOAMP | Orthogonally invariant | Averaged Jacobian (ξ,η) | Asymptotic |
| LAMP/LISTA | Learned (typ. i.i.d.) | Learnable bₜ, Jacobian trace | Near-i.i.d. |
The extension of Onsager-corrected updates to generic denoisers and broadened matrix ensembles is central to the GOAMP approach.
7. Significance in Sparse Estimation Literature
The systematic design and deployment of Onsager corrections distinguish GOAMP as a theoretically principled and practically robust framework for high-dimensional, sublinear-sparsity inference under broad measurement models. The rigorous alignment of algorithmic iteration with state-evolution scalars—mediated by the Onsager correction—guarantees both algorithmic stability and sharp phase transitions matching information-theoretic limits (Takeuchi, 3 Dec 2025). As a result, GOAMP and its Onsager-corrected variants provide a unified and analyzable basis for modern high-dimensional signal reconstruction with strong robustness to structural deviations in sensing operators.