Equivariant Transformer is all you need (2310.13222v1)
Abstract: Machine learning, especially deep learning, has been accelerating computational physics, which is used to simulate systems on a lattice. Equivariance is essential when simulating a physical system because it imposes a strong inductive bias on the probability distribution described by a machine learning model. This reduces the risk of erroneous extrapolation that deviates from data symmetries and physical laws. However, imposing symmetry on a model sometimes leads to a poor acceptance rate in self-learning Monte Carlo (SLMC). On the other hand, the attention mechanism used in Transformers such as GPT realizes a large model capacity. We introduce symmetry-equivariant attention to SLMC. To evaluate it, we apply our proposed new architecture to a spin-fermion model on a two-dimensional lattice. We find that it overcomes the poor acceptance rates of linear models, and we observe a scaling law for the acceptance rate, as seen in large language models based on Transformers.
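The key observation behind symmetry-equivariant attention can be illustrated with a minimal sketch: self-attention over lattice sites with no absolute positional encoding is equivariant under site permutations, and lattice translations form a subgroup of those permutations. The sketch below is an assumption-laden toy (the weight matrices `Wq`, `Wk`, `Wv` and the function name are hypothetical and not taken from the paper's architecture), intended only to show the equivariance property, not the authors' actual model.

```python
import numpy as np

def lattice_self_attention(x, Wq, Wk, Wv):
    """Toy single-head self-attention over 2D lattice sites.

    No absolute positional encoding is used, so the map is equivariant
    under any permutation of sites, and in particular under lattice
    translations. This is a sketch, not the paper's architecture.
    """
    L, _, d = x.shape
    h = x.reshape(L * L, d)                    # flatten sites
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    scores = q @ k.T / np.sqrt(d)
    a = np.exp(scores - scores.max(axis=1, keepdims=True))
    a /= a.sum(axis=1, keepdims=True)          # row-wise softmax
    return (a @ v).reshape(L, L, -1)

rng = np.random.default_rng(0)
L, d = 4, 8
x = rng.normal(size=(L, L, d))                 # features on an L x L lattice
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))

y = lattice_self_attention(x, Wq, Wk, Wv)
y_shifted = lattice_self_attention(np.roll(x, 1, axis=0), Wq, Wk, Wv)

# Translating the input translates the output: equivariance.
assert np.allclose(np.roll(y, 1, axis=0), y_shifted)
```

Because permuting the rows of the flattened features permutes the queries, keys, and values consistently, the attention matrix transforms as `P A P.T` and the output as `P (A v)`, which is exactly the equivariance condition checked by the final assertion.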