Hot-Ham: an accurate and efficient E(3)-equivariant machine-learning electronic structures calculation framework (2509.04875v1)
Abstract: Combinations of machine learning with ab initio methods have attracted much attention for their potential to resolve the accuracy-efficiency dilemma and facilitate calculations for large-scale systems. Recently, equivariant message passing neural networks (MPNNs) that explicitly incorporate symmetry constraints have demonstrated promise for interatomic potential and density functional theory (DFT) Hamiltonian predictions. However, the high-order tensors used to represent node and edge information are coupled through the Clebsch-Gordan tensor product (CGTP), leading to steep increases in computational complexity and seriously hindering the performance of equivariant MPNNs. Here, we develop High-order Tensor machine-learning Hamiltonian (Hot-Ham), an E(3)-equivariant MPNN framework that combines two advanced techniques, local coordinate transformation and the Gaunt tensor product (GTP), to efficiently model DFT Hamiltonians. These two innovations significantly reduce the complexity of tensor products from O(L^6) to O(L^3) or O(L^2 log^2 L) for the maximum tensor order L, and enhance the performance of MPNNs. Benchmarks on several public datasets demonstrate its state-of-the-art accuracy with relatively few parameters, and applications to multilayer twisted moiré systems, heterostructures and allotropes showcase its generalization ability and high efficiency. Our Hot-Ham method provides a new perspective for developing efficient equivariant neural networks and would be a promising approach for investigating the electronic properties of large-scale materials systems.
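The abstract's complexity claim can be made concrete with a back-of-the-envelope operation count. The sketch below (not the authors' code; the cost models are simplified assumptions) counts multiplications for a full Clebsch-Gordan tensor product over all valid paths (l1, l2) → l3, which scales as O(L^6), against the O(L^3) grid-multiplication cost model quoted for the Gaunt tensor product:

```python
def cgtp_ops(L: int) -> int:
    """Rough multiply count for a full Clebsch-Gordan tensor product.

    Every valid coupling path (l1, l2) -> l3 with |l1 - l2| <= l3 <= l1 + l2
    and l3 <= L produces 2*l3 + 1 output components, each summing over
    (2*l1 + 1) * (2*l2 + 1) input pairs -- hence the O(L^6) scaling.
    """
    total = 0
    for l1 in range(L + 1):
        for l2 in range(L + 1):
            for l3 in range(abs(l1 - l2), min(l1 + l2, L) + 1):
                total += (2 * l1 + 1) * (2 * l2 + 1) * (2 * l3 + 1)
    return total

def gtp_ops(L: int) -> int:
    """Simplified cost model for a Gaunt tensor product.

    Pointwise multiplication on an O(L) x O(L) angular grid for O(L)
    channels gives roughly O(L^3) work (O(L^2 log^2 L) with FFT-based
    2D convolution, per the abstract).
    """
    return (L + 1) ** 3

for L in (2, 4, 8):
    print(f"L={L}: CGTP ~{cgtp_ops(L)} ops, GTP ~{gtp_ops(L)} ops")
```

The gap widens rapidly with L, which is why replacing the CGTP is the key efficiency lever for high-order equivariant features.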