Approximating power of machine-learning ansatz for quantum many-body states (1901.08615v1)

Published 24 Jan 2019 in cond-mat.dis-nn, cond-mat.str-el, and quant-ph

Abstract: An artificial neural network (ANN) with the restricted Boltzmann machine (RBM) architecture was recently proposed as a versatile variational quantum many-body wave function. In this work we provide physical insights into the performance of this ansatz. We uncover the connection between the structure of the RBM and perturbation series, which explains the excellent precision achieved by the RBM ansatz in certain simple models, as demonstrated in the literature. Based on this relation, we improve the numerical algorithm to achieve better performance of the RBM in cases where local minima complicate the convergence to the global one. We introduce other classes of variational wave functions, which are also capable of reproducing the perturbative structure, and show that their performance is comparable to that of the RBM. Furthermore, we study the performance of a few-layer RBM for approximating ground states of random, translationally invariant models in 1d, as well as random matrix-product states (MPS). We find that the error in approximating such states exhibits a broad distribution, and is largely determined by the entanglement properties of the targeted state.
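
For context, the RBM variational ansatz referred to in the abstract is usually written in the standard form introduced in the earlier literature (quoted here as background, not from this paper's own text): the amplitude of a spin configuration is obtained by summing out binary hidden units, which can be done analytically,

\[
\Psi_{\mathrm{RBM}}(s_1,\dots,s_N)
= \sum_{\{h_i=\pm 1\}} \exp\!\Big(\sum_j a_j s_j + \sum_i b_i h_i + \sum_{ij} W_{ij} h_i s_j\Big)
= e^{\sum_j a_j s_j} \prod_i 2\cosh\!\Big(b_i + \sum_j W_{ij} s_j\Big),
\]

where \(a_j\), \(b_i\), and \(W_{ij}\) are the (generally complex) variational parameters optimized during training.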
