
Physics-informed Transformers for Electronic Quantum States (2412.12248v1)

Published 16 Dec 2024 in cond-mat.str-el, cond-mat.dis-nn, and quant-ph

Abstract: Neural-network-based variational quantum states in general, and autoregressive models in particular, have proven to be powerful tools for describing complex many-body wave functions. However, their performance depends crucially on the chosen computational basis, and they often lack physical interpretability. To mitigate these issues, we propose a modified variational Monte Carlo framework that leverages prior physical information to construct a computational second-quantized basis containing a reference state that serves as a rough approximation to the true ground state. In this basis, a Transformer is used to parametrize and autoregressively sample the corrections to the reference state, giving rise to a more interpretable and computationally efficient representation of the ground state. We demonstrate this approach using a non-sparse fermionic model featuring a metal-insulator transition, employing Hartree-Fock and a strong-coupling limit to define physics-informed bases. We also show that the Transformer's hidden representation captures the natural energetic order of the different basis states. This work paves the way for more efficient and interpretable neural quantum-state representations.
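The core idea — autoregressively sampling corrections to a physics-informed reference state rather than sampling configurations from scratch — can be illustrated with a minimal sketch. The snippet below is not the paper's method: it replaces the Transformer with a trivial per-site conditional (independent flip logits) purely to show the sampling structure; all names (`sample_correction`, `flip_logits`) are hypothetical.

```python
import math
import random

def sample_correction(reference, flip_logits, rng):
    """Autoregressively sample an occupation configuration expressed as
    site-wise corrections (bit flips) to a reference state, e.g. a
    Hartree-Fock determinant encoded as a bitstring.

    Returns the sampled configuration and its exact log-probability,
    which autoregressive models provide for free. Here flip_logits[i]
    depends only on the site index; in the paper each conditional would
    come from a Transformer conditioned on previously sampled sites.
    """
    config, logp = [], 0.0
    for i, ref_bit in enumerate(reference):
        p_flip = 1.0 / (1.0 + math.exp(-flip_logits[i]))  # sigmoid
        flip = rng.random() < p_flip
        config.append(ref_bit ^ int(flip))  # apply correction to reference
        logp += math.log(p_flip if flip else 1.0 - p_flip)
    return config, logp

# Usage: strongly negative logits make flips rare, so samples concentrate
# near the reference state — the regime where the reference is a good
# approximation to the ground state.
rng = random.Random(0)
hf_reference = [1, 1, 0, 0]  # toy half-filled reference occupation
config, logp = sample_correction(hf_reference, [-10.0] * 4, rng)
```

Because the model parametrizes deviations from the reference, a well-chosen basis lets most of the probability mass sit on few-flip configurations, which is what makes the representation both compact and interpretable.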
