
Sparse Autoregressive Neural Networks for Classical Spin Systems (2402.16579v2)

Published 26 Feb 2024 in cond-mat.stat-mech, cond-mat.dis-nn, math.CO, and physics.comp-ph

Abstract: Efficient sampling and approximation of Boltzmann distributions over large sets of binary variables, or spins, are pivotal in diverse scientific fields, even beyond physics. Recent advances in generative neural networks have significantly impacted this domain. However, these neural networks are often treated as black boxes, with architectures shaped primarily by data-driven problems in computational science. Addressing this gap, we introduce a novel autoregressive neural network architecture named TwoBo, specifically designed for sparse two-body interacting spin systems. We incorporate the Boltzmann distribution directly into its architecture and parameters, resulting in faster convergence, more accurate free energies, and fewer trainable parameters. We perform numerical experiments on disordered, frustrated systems with more than 1000 spins on grids and random graphs, and demonstrate the advantages of TwoBo over previous autoregressive and recurrent architectures. Our findings validate this physically informed approach and suggest extensions to multivalued variables and many-body interaction systems, paving the way for broader applications in scientific research.
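To make the setup concrete, below is a minimal sketch of the general variational autoregressive approach the paper builds on, in the spirit of variational autoregressive networks (Wu, Wang, and Zhang, Phys. Rev. Lett. 122, 080602), not the TwoBo architecture itself. A masked autoregressive model over ±1 spins is sampled ancestrally, and the variational free energy F_q = E_q[E(s)] + T·E_q[log q(s)] is minimized with a score-function (REINFORCE) gradient. All class names, hyperparameters, and the dense masked layer are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of a variational autoregressive network (VAN) for an
# Ising-like two-body spin system. Illustrative only; NOT the TwoBo
# architecture described in this paper.
import torch
import torch.nn as nn

class MaskedLinear(nn.Linear):
    """Linear layer with a strictly lower-triangular mask, so the output
    for spin i depends only on spins j < i (autoregressive ordering)."""
    def __init__(self, n):
        super().__init__(n, n)
        self.register_buffer("mask", torch.tril(torch.ones(n, n), diagonal=-1))
    def forward(self, x):
        return nn.functional.linear(x, self.mask * self.weight, self.bias)

class VAN(nn.Module):
    def __init__(self, n):
        super().__init__()
        self.n = n
        self.net = MaskedLinear(n)  # one masked layer; deeper stacks are typical
    def _cond_prob(self, s):
        # p(s_i = +1 | s_{<i}) for all sites in a single pass
        return torch.sigmoid(self.net(s))
    def sample(self, batch):
        s = torch.zeros(batch, self.n)
        for i in range(self.n):  # sequential ancestral sampling
            p = self._cond_prob(s)[:, i]
            s[:, i] = 2 * torch.bernoulli(p) - 1  # spins in {-1, +1}
        return s
    def log_prob(self, s):
        p = self._cond_prob(s)
        up = (s + 1) / 2  # 1 where s_i = +1, else 0
        logp = up * torch.log(p + 1e-10) + (1 - up) * torch.log(1 - p + 1e-10)
        return logp.sum(dim=1)

def ising_energy(s, J):
    # E(s) = -sum_{i<j} J_ij s_i s_j for a symmetric coupling matrix J
    # with zero diagonal (the 0.5 compensates the double counting)
    return -0.5 * torch.einsum("bi,ij,bj->b", s, J, s)

def free_energy_step(model, J, beta, opt, batch=256):
    """One variational step: minimize beta*F = E_q[beta*E(s) + log q(s)]
    with a REINFORCE (score-function) gradient and a mean baseline."""
    with torch.no_grad():
        s = model.sample(batch)
    logq = model.log_prob(s)
    with torch.no_grad():
        f = beta * ising_energy(s, J) + logq  # per-sample beta*F estimator
        baseline = f.mean()
    loss = ((f - baseline) * logq).mean()     # score-function gradient
    opt.zero_grad(); loss.backward(); opt.step()
    return baseline.item() / beta             # variational free energy estimate

if __name__ == "__main__":
    n = 16
    J = torch.randn(n, n)
    J = (J + J.t()) / 2
    J.fill_diagonal_(0)  # toy symmetric couplings
    model = VAN(n)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(200):
        F = free_energy_step(model, J, beta=1.0, opt=opt)
    print(f"variational free energy estimate: {F:.3f}")
```

The dense MaskedLinear above is used only for brevity: per the abstract, TwoBo instead builds the Boltzmann distribution of the sparse two-body Hamiltonian directly into the architecture and parameters, which is where the reported reduction in trainable parameters and the improved free-energy accuracy come from.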
