Broader AMP non-asymptotic conjecture

Ascertain whether approximate message passing (AMP) and related spin-glass techniques provide accurate predictive power well outside their proven proportional-asymptotic regimes, by rigorously establishing non-asymptotic validity of AMP-based state-evolution predictions across arbitrary scalings of sample size, dimension, and regularization for high-dimensional learning problems such as LASSO and matrix compressed sensing.

Background

The paper relies on AMP and state evolution to analyze scaling laws and spectra for diagonal and quadratic networks. While AMP is rigorously justified only in the proportional-asymptotic regime (n, d → ∞ with n/d fixed and λ constant), the authors use it heuristically beyond this setting and observe strong empirical agreement. Motivated by these findings and by analogous results for ridge regression, they propose a broader conjecture on AMP’s predictive power in non-asymptotic regimes.
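
To make the conjectured prediction concrete, the following is a minimal sketch (not the paper's code) of AMP for the LASSO together with the scalar state-evolution recursion it is conjectured to track even at finite sample size n and dimension d. The Gaussian design, Bernoulli-Gaussian signal prior, thresholding rule, and all parameter values below are illustrative assumptions.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm (the LASSO denoiser).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def run_amp(A, y, alpha, n_iter=30):
    # AMP iteration for the LASSO with an Onsager-corrected residual.
    n, d = A.shape
    delta = n / d
    x, z = np.zeros(d), y.copy()
    for _ in range(n_iter):
        tau = np.sqrt(np.mean(z ** 2))             # empirical effective noise level
        x_new = soft_threshold(x + A.T @ z, alpha * tau)
        onsager = np.mean(np.abs(x_new) > 0) / delta
        z = y - A @ x_new + onsager * z            # Onsager correction term
        x = x_new
    return x

def state_evolution(sample_prior, sigma2, delta, alpha, n_iter=30, mc=200_000):
    # Scalar SE recursion: tau_{t+1}^2 = sigma^2 + E[(eta(X + tau_t Z) - X)^2] / delta.
    rng = np.random.default_rng(0)
    X = sample_prior(rng, mc)
    tau2 = sigma2 + np.mean(X ** 2) / delta
    for _ in range(n_iter):
        Z = rng.standard_normal(mc)
        est = soft_threshold(X + np.sqrt(tau2) * Z, alpha * np.sqrt(tau2))
        tau2 = sigma2 + np.mean((est - X) ** 2) / delta
    return tau2

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d, k, sigma, alpha = 500, 2000, 50, 0.1, 1.5   # one finite-size instance
    x_star = np.zeros(d)
    x_star[:k] = rng.standard_normal(k)
    A = rng.standard_normal((n, d)) / np.sqrt(n)      # i.i.d. Gaussian design, unit-norm columns
    y = A @ x_star + sigma * rng.standard_normal(n)

    x_hat = run_amp(A, y, alpha)
    prior = lambda r, m: np.where(r.random(m) < k / d, r.standard_normal(m), 0.0)
    tau2 = state_evolution(prior, sigma ** 2, n / d, alpha)

    # State evolution predicts the per-coordinate MSE as delta * (tau^2 - sigma^2).
    print("empirical MSE    :", np.mean((x_hat - x_star) ** 2))
    print("SE-predicted MSE :", (n / d) * (tau2 - sigma ** 2))

Under the conjecture, the two printed quantities should remain close not only in the proportional limit where n and d grow with n/d fixed, but across essentially arbitrary finite choices of sample size, dimension, and regularization.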

Establishing such a result would unify the heuristic success of AMP across a range of architectures and inference tasks, including sparse vector recovery (LASSO) and low-rank matrix estimation (matrix compressed sensing), and would bridge current gaps between asymptotic theory and finite-sample practice.

References

“This surprising robustness, already established in ridge regression, suggests a broader conjecture: the AMP framework, and related tools from spin glass theory, may provide predictive power well outside their standard asymptotic assumptions.”

Defilippis et al., “Scaling Laws and Spectra of Shallow Neural Networks in the Feature Learning Regime,” arXiv:2509.24882, 29 Sep 2025, Section “Non-asymptotic state evolution.”