
Learning Lattice Quantum Field Theories with Equivariant Continuous Flows (2207.00283v3)

Published 1 Jul 2022 in hep-lat, cond-mat.stat-mech, cs.LG, and hep-th

Abstract: We propose a novel machine learning method for sampling from the high-dimensional probability distributions of Lattice Field Theories, which is based on a single neural ODE layer and incorporates the full symmetries of the problem. We test our model on the $\phi^4$ theory, showing that it systematically outperforms previously proposed flow-based methods in sampling efficiency, and the improvement is especially pronounced for larger lattices. Furthermore, we demonstrate that our model can learn a continuous family of theories at once, and the results of learning can be transferred to larger lattices. Such generalizations further accentuate the advantages of machine learning methods.
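The sampling scheme the abstract describes can be illustrated with a minimal sketch: draw configurations from a Gaussian prior, push them through a continuous flow (an ODE in field space) while tracking the model density via the divergence of the vector field, then reweight against the $\phi^4$ action. Everything below is an assumption for illustration only: the action conventions (`m2`, `lam`), the toy vector field $v(\phi) = -\alpha\phi$ (chosen merely because it is odd in $\phi$ and site-wise, so it respects the Z2 and translation symmetries), and the Euler integrator are not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi4_action(phi, m2=-1.0, lam=1.0):
    """Lattice phi^4 action on a periodic 2D lattice.

    phi: array of shape (batch, L, L). The m2/lam conventions vary between
    papers; these values are illustrative, not taken from the source.
    """
    kin = sum(0.5 * np.sum((np.roll(phi, 1, axis=ax) - phi) ** 2, axis=(1, 2))
              for ax in (1, 2))
    pot = np.sum(0.5 * m2 * phi ** 2 + lam * phi ** 4, axis=(1, 2))
    return kin + pot

def flow_sample(batch=256, L=4, alpha=0.3, steps=20):
    """Sample z ~ N(0,1) per site and integrate the toy ODE dphi/dt = -alpha*phi.

    The model log-density evolves as d(log q)/dt = -div(v); for this linear
    field div(v) = -alpha * n_sites, so log q can be accumulated exactly
    alongside the Euler steps.
    """
    z = rng.standard_normal((batch, L, L))
    n_sites = L * L
    # log-density of the standard-normal prior
    log_q = -0.5 * np.sum(z ** 2, axis=(1, 2)) - 0.5 * n_sites * np.log(2 * np.pi)
    phi = z.copy()
    dt = 1.0 / steps
    for _ in range(steps):
        phi += dt * (-alpha * phi)      # Euler step of the flow ODE
        log_q += dt * alpha * n_sites   # -div(v) contribution per step
    return phi, log_q

phi, log_q = flow_sample()
log_w = -phi4_action(phi) - log_q       # unnormalised importance log-weights
log_w -= log_w.max()                    # stabilise before exponentiating
w = np.exp(log_w)
ess = w.sum() ** 2 / (len(w) * (w ** 2).sum())  # effective-sample-size fraction
print(f"ESS fraction: {ess:.3f}")
```

The ESS fraction is the sampling-efficiency measure in which the paper reports its improvements: a trained, symmetry-aware vector field would drive it toward 1, whereas this untrained toy field merely demonstrates the bookkeeping.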
