
Fourier Analysis of Iterative Algorithms (2404.07881v2)

Published 11 Apr 2024 in cs.CC, cs.DS, and math.CO

Abstract: We study a general class of nonlinear iterative algorithms which includes power iteration, belief propagation and approximate message passing, and many forms of gradient descent. When the input is a random matrix with i.i.d. entries, we use Boolean Fourier analysis to analyze these algorithms as low-degree polynomials in the entries of the input matrix. Each symmetrized Fourier character represents all monomials with a certain shape as specified by a small graph, which we call a Fourier diagram. We prove fundamental asymptotic properties of the Fourier diagrams: over the randomness of the input, all diagrams with cycles are negligible; the tree-shaped diagrams form a basis of asymptotically independent Gaussian vectors; and, when restricted to the trees, iterative algorithms exactly follow an idealized Gaussian dynamic. We use this to prove a state evolution formula, giving a "complete" asymptotic description of the algorithm's trajectory. The restriction to tree-shaped monomials mirrors the assumption of the cavity method, a 40-year-old non-rigorous technique in statistical physics which has served as one of the most important techniques in the field. We demonstrate how to implement cavity method derivations by 1) restricting the iteration to its tree approximation, and 2) observing that heuristic cavity method-type arguments hold rigorously on the simplified iteration. Our proofs use combinatorial arguments similar to the trace method from random matrix theory. Finally, we push the diagram analysis to a number of iterations that scales with the dimension $n$ of the input matrix, proving that the tree approximation still holds for a simple variant of power iteration all the way up to $n^{\Omega(1)}$ iterations.
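The "idealized Gaussian dynamic" and state evolution formula mentioned in the abstract can be illustrated with a minimal numerical sketch. This is not the paper's diagram analysis: it uses a fresh i.i.d. Gaussian matrix at every step (so the Gaussian dynamics hold exactly by construction), and the nonlinearity `tanh` and all dimensions are arbitrary choices for illustration. The point is to show what a state evolution recursion predicts: the empirical variance of the iterate tracks a scalar recursion $\sigma_{t+1}^2 = \mathbb{E}[\tanh(\sigma_t Z)^2]$, $Z \sim \mathcal{N}(0,1)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 4000, 5

# Idealized Gaussian dynamics: x^{t+1} = (1/sqrt(n)) A_t tanh(x^t),
# with a FRESH i.i.d. Gaussian matrix A_t at each step. Conditioned on x^t,
# every coordinate of x^{t+1} is N(0, ||tanh(x^t)||^2 / n), so the empirical
# variance of x should track the scalar state-evolution recursion
#   sigma_{t+1}^2 = E[tanh(sigma_t Z)^2],  Z ~ N(0, 1).
x = rng.standard_normal(n)
sigma2 = 1.0                          # state-evolution prediction for E[x_i^2]
z = rng.standard_normal(200_000)      # Monte Carlo samples for the SE expectation

for t in range(T):
    A = rng.standard_normal((n, n))   # fresh matrix: dynamics are exactly Gaussian
    x = A @ np.tanh(x) / np.sqrt(n)
    sigma2 = np.mean(np.tanh(np.sqrt(sigma2) * z) ** 2)
    print(t, round(x.var(), 4), round(sigma2, 4))
```

The paper's contribution is the harder setting where the *same* matrix is reused across iterations: there, correlations between iterations are controlled by showing the cyclic Fourier diagrams are negligible, so the tree-shaped terms alone reproduce this idealized Gaussian behavior.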
