Efficient Learning of a One-dimensional Density Functional Theory

Published 6 May 2020 in cond-mat.dis-nn and cond-mat.str-el (arXiv:2005.03014v2)

Abstract: Density functional theory underlies the most successful and widely used numerical methods for electronic structure prediction of solids. However, it has the fundamental shortcoming that the universal density functional is unknown. In addition, the computational result (the energy and charge density distribution of the ground state) is useful for electronic properties of solids mostly when reduced to a band structure interpretation based on the Kohn-Sham approach. Here, we demonstrate how machine learning algorithms can help to free density functional theory from these limitations. We study a theory of spinless fermions on a one-dimensional lattice. The density functional is implicitly represented by a neural network, which predicts, besides the ground-state energy and density distribution, density-density correlation functions. At no point do we require a band structure interpretation. The training data, obtained via exact diagonalization, feeds into a learning scheme inspired by active learning, which minimizes the computational costs for data generation. We show that the network results are of high quantitative accuracy and, despite learning on random potentials, capture both symmetry-breaking and topological phase transitions correctly.
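
As a rough illustration of the pipeline described in the abstract, the sketch below generates training data by exact diagonalization of a small open chain of interacting spinless fermions in random on-site potentials, and then fits a neural network that maps the potential to the ground-state energy and density profile. This is not the authors' code: the chain length, filling, couplings, the scikit-learn MLPRegressor architecture, and the plain random sampling of potentials (in place of the paper's active-learning data selection and correlation-function outputs) are all assumptions made for the example.

```python
import numpy as np
from itertools import combinations
from sklearn.neural_network import MLPRegressor

# Illustrative parameters (not taken from the paper): chain length, filling, couplings.
L, N_PART = 8, 4
T_HOP, V_INT = 1.0, 1.0

def basis(num_sites, num_fermions):
    """All occupation bitmasks with exactly num_fermions particles on num_sites sites."""
    return [sum(1 << i for i in occ) for occ in combinations(range(num_sites), num_fermions)]

def exact_ground_state(v):
    """Exact diagonalization of interacting spinless fermions on an open chain with on-site potential v."""
    states = basis(L, N_PART)
    index = {s: k for k, s in enumerate(states)}
    H = np.zeros((len(states), len(states)))
    for k, s in enumerate(states):
        for i in range(L):
            if s >> i & 1:
                H[k, k] += v[i]                      # on-site potential
                if i + 1 < L and s >> (i + 1) & 1:
                    H[k, k] += V_INT                 # nearest-neighbour repulsion
        for i in range(L - 1):                       # hopping; adjacent sites carry no fermionic sign
            if (s >> i & 1) != (s >> (i + 1) & 1):
                H[index[s ^ (1 << i) ^ (1 << (i + 1))], k] -= T_HOP
    energies, vectors = np.linalg.eigh(H)
    weights = vectors[:, 0] ** 2
    occupations = np.array([[(s >> i) & 1 for i in range(L)] for s in states], dtype=float)
    return energies[0], occupations.T @ weights      # E_0 and density profile <n_i>

# Training set: random potentials labelled by exact diagonalization.
rng = np.random.default_rng(0)
potentials = rng.uniform(-2.0, 2.0, size=(200, L))
labels = np.array([np.concatenate(([e0], n)) for e0, n in map(exact_ground_state, potentials)])

# Neural network standing in for the unknown functional: potential -> (E_0, n_1..n_L).
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
model.fit(potentials, labels)

# Compare the network to exact diagonalization on an unseen random potential.
v_test = rng.uniform(-2.0, 2.0, size=L)
e_exact, n_exact = exact_ground_state(v_test)
prediction = model.predict(v_test[None, :])[0]
print(f"exact E0 = {e_exact:.4f}, predicted E0 = {prediction[0]:.4f}")
print("max density error:", np.max(np.abs(prediction[1:] - n_exact)))
```

In the paper, such a network plays the role of the implicitly represented density functional; the sketch only checks that the learned map reproduces exact-diagonalization energies and densities on a held-out random potential.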

Citations (8)
