
Design principles of deep translationally-symmetric neural quantum states for frustrated magnets (2505.03466v1)

Published 6 May 2025 in cond-mat.str-el, cond-mat.dis-nn, and quant-ph

Abstract: Deep neural network quantum states have emerged as a leading method for studying the ground states of quantum magnets. Successful architectures exploit translational symmetry, but the root of their effectiveness and the differences between architectures remain unclear. Here, we apply the ConvNext architecture, designed to incorporate elements of transformers into convolutional networks, to quantum many-body ground states. We find that it is remarkably similar to the factored vision transformer, which has been employed successfully for several frustrated spin systems, allowing us to relate this architecture to more conventional convolutional networks. Through a series of numerical experiments we design the ConvNext to achieve the greatest performance at the lowest computational cost. We then apply this network to the Shastry-Sutherland and J1-J2 models, obtaining variational energies comparable to the state of the art and providing a blueprint for the network design choices of translationally-symmetric architectures to tackle challenging ground-state problems in frustrated magnetism.
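The translational symmetry the abstract refers to can be illustrated with a minimal sketch: a convolutional log-amplitude with periodic boundary conditions, followed by a sum over lattice sites, is exactly invariant under lattice translations of the spin configuration. This is not the paper's ConvNext architecture; the kernel sizes, nonlinearity, and random parameters below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 6           # linear size of a periodic L x L spin lattice (assumption)
n_filters = 4   # number of convolutional filters (assumption)
K = 3           # kernel size (assumption)
# Random, untrained parameters; a real NQS would optimize these variationally.
kernels = rng.normal(size=(n_filters, K, K)) * 0.1

def log_amplitude(spins):
    """Toy translationally-symmetric log-amplitude log|psi(s)|.

    A periodic (circular) convolution followed by a sum over all sites
    commutes with lattice translations, so the output is invariant --
    the basic symmetry property the abstract's architectures exploit.
    """
    total = 0.0
    for f in kernels:
        # Circular convolution via explicit rolls (periodic boundaries).
        pre = np.zeros(spins.shape, dtype=float)
        for dx in range(K):
            for dy in range(K):
                pre += f[dx, dy] * np.roll(spins, shift=(dx, dy), axis=(0, 1))
        total += np.sum(np.log(np.cosh(pre)))  # nonlinearity + spatial sum
    return total

spins = rng.choice([-1, 1], size=(L, L))
shifted = np.roll(spins, shift=(2, 1), axis=(0, 1))  # translated configuration
# Translation invariance: both configurations get the same log-amplitude.
assert np.isclose(log_amplitude(spins), log_amplitude(shifted))
```

Because the symmetrization is built into the architecture (rather than enforced by averaging over all translated copies at evaluation time), it comes at no extra sampling cost, which is one reason convolutional and factored-transformer designs are attractive for these problems.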
