
Generative models for scalar field theories: how to deal with poor scaling? (2301.01504v1)

Published 4 Jan 2023 in hep-lat and cs.LG

Abstract: Generative models, such as the method of normalizing flows, have been suggested as alternatives to the standard algorithms for generating lattice gauge field configurations. Studies with the method of normalizing flows demonstrate a proof of principle for simple models in two dimensions. However, further studies indicate that the training cost can, in general, be very high for large lattices. The poor scaling traits of current models indicate that moderate-size networks cannot efficiently handle the inherently multi-scale aspects of the problem, especially around critical points. We explore current models with limited acceptance rates for large lattices and examine new architectures inspired by effective field theories to improve scaling traits. We also discuss alternative ways of handling poor acceptance rates for large lattices.
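
To make the setup concrete, the following is a minimal sketch of the general flow-based sampling strategy the abstract refers to: an affine-coupling normalizing flow proposes configurations of a phi^4 scalar theory on a small 2D lattice, and an independence-Metropolis step accepts or rejects them; the resulting acceptance rate is the quantity whose degradation with volume is at issue. The lattice size, couplings, and network shape below are hypothetical placeholders (and the training loop is omitted), not the architecture or parameters studied in the paper.

import math
import torch
import torch.nn as nn

L = 8                       # lattice extent (illustrative, not from the paper)
m2, lam = -4.0, 8.0         # hypothetical bare mass^2 and quartic coupling

def action(phi):
    # Euclidean phi^4 action on a periodic L x L lattice; phi has shape (B, L, L).
    kin = sum(((phi - torch.roll(phi, 1, dims=d)) ** 2).sum(dim=(1, 2)) for d in (1, 2))
    pot = (m2 * phi ** 2 + lam * phi ** 4).sum(dim=(1, 2))
    return 0.5 * kin + pot

class AffineCoupling(nn.Module):
    # One checkerboard-masked affine coupling layer (RealNVP-style).
    def __init__(self, parity):
        super().__init__()
        x = torch.arange(L)
        self.register_buffer("mask", ((x[:, None] + x[None, :]) % 2 == parity).float())
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1, padding_mode="circular"), nn.Tanh(),
            nn.Conv2d(16, 2, 3, padding=1, padding_mode="circular"))

    def forward(self, z):
        frozen = z * self.mask
        s, t = self.net(frozen.unsqueeze(1)).chunk(2, dim=1)
        s = s.squeeze(1) * (1 - self.mask)
        t = t.squeeze(1) * (1 - self.mask)
        phi = frozen + (1 - self.mask) * (z * torch.exp(s) + t)
        return phi, s.sum(dim=(1, 2))          # log|det J| of this layer

flow = nn.ModuleList([AffineCoupling(p % 2) for p in range(4)])

def sample(batch):
    # Draw z ~ N(0, 1), push through the flow, return phi and log q(phi).
    z = torch.randn(batch, L, L)
    logq = (-0.5 * z ** 2).sum(dim=(1, 2)) - 0.5 * L * L * math.log(2 * math.pi)
    for layer in flow:
        z, logdet = layer(z)
        logq = logq - logdet                   # change of variables
    return z, logq

# Training (minimizing the reverse KL between q and e^{-S}/Z) is omitted here.
# After training, the flow serves as an independence-Metropolis proposal:
with torch.no_grad():
    phi, logq = sample(1024)
    logw = -action(phi) - logq                 # unnormalized log importance weight
    accepted, current = 0, logw[0]
    for lw in logw[1:]:
        # Accept with probability min(1, w(proposal) / w(current)).
        if torch.rand(()) < torch.exp(torch.clamp(lw - current, max=0.0)):
            current = lw
            accepted += 1
    print(f"acceptance rate: {accepted / (len(logw) - 1):.2%}")

With an untrained flow this acceptance rate is essentially zero; the paper's concern is that, even after training, keeping it usable near criticality becomes expensive as the lattice volume grows.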

Citations (5)
