Finite-size scaling of neural ansatz performance in solid-state simulations

Characterize how the performance of neural-network wavefunction ansatzes for interacting electrons (accuracy, convergence behavior, and the required number of variational parameters) changes as the system size increases in periodic solid-state simulations, thereby quantifying finite-size effects.

Background

Finite-size effects are crucial in numerical simulations of solids due to periodic boundary conditions and supercell approximations. The paper introduces a self-attention-based neural ansatz and later reports an empirical parameter scaling N_par ∝ N^{2.1} for moiré systems, suggesting favorable scaling compared to some tensor-network approaches.
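An empirical scaling exponent like the reported N_par ∝ N^{2.1} is typically extracted by a least-squares fit in log-log space over a series of system sizes. The sketch below illustrates that procedure on synthetic (system size, parameter count) pairs; the numbers are invented for demonstration and are not data from the paper.

```python
import math

# Hypothetical (system size, parameter count) pairs following an exact
# power law N_par = c * N**2.1; synthetic values, NOT data from the paper.
sizes = [4, 8, 16, 32]
n_par = [1.0e4 * n**2.1 for n in sizes]

# Fit the exponent alpha in N_par ~ c * N**alpha via least squares
# on the log-transformed data: log(N_par) = log(c) + alpha * log(N).
xs = [math.log(n) for n in sizes]
ys = [math.log(p) for p in n_par]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
alpha = (
    sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    / sum((x - x_mean) ** 2 for x in xs)
)
print(f"fitted exponent: {alpha:.2f}")
```

On this exact synthetic power law the fit recovers the exponent 2.1; on real benchmark data the same procedure would also yield a confidence interval, which matters when comparing against tensor-network scaling.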

The open question explicitly posed in the Introduction asks for a general understanding of how performance evolves with system size, beyond specific empirical observations, to guide reliable large-scale applications.

References

Despite the rapid progress, two important questions remain open. [...] Second, it is essential to assess the finite-size effect in numerical simulations of solid-state systems. How does the performance of the neural ansatz change as the system size increases?

Is attention all you need to solve the correlated electron problem? (2502.05383 - Geier et al., 7 Feb 2025) in Section 1 (Introduction)