Incorporating long-range physics in atomic-scale machine learning (1909.04512v1)

Published 10 Sep 2019 in physics.chem-ph

Abstract: The most successful and popular machine learning models of atomic-scale properties derive their transferability from a locality ansatz. The properties of a large molecule or a bulk material are written as a sum over contributions that depend on the configurations within finite atom-centered environments. The obvious downside of this approach is that it cannot capture non-local, non-additive effects such as those arising due to long-range electrostatics or quantum interference. We propose a solution to this problem by introducing non-local representations of the system that are remapped as feature vectors that are defined locally and are equivariant in O(3). We consider in particular one form that has the same asymptotic behavior as the electrostatic potential. We demonstrate that this framework can capture non-local, long-range physics by building a model for the electrostatic energy of randomly distributed point-charges, for the unrelaxed binding curves of charged organic molecular dimers, and for the electronic dielectric response of liquid water. By combining a representation of the system that is sensitive to long-range correlations with the transferability of an atom-centered additive model, this method outperforms current state-of-the-art machine-learning schemes, and provides a conceptual framework to incorporate non-local physics into atomistic machine learning.
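The core idea is to keep the transferable atom-centered, additive structure of standard models while feeding each center a feature that depends on the whole system and decays like the electrostatic potential. The sketch below is a minimal, hypothetical illustration of that idea for the paper's simplest benchmark (randomly distributed point charges): it builds a bare 1/r potential feature at each atomic center and assembles the energy as a sum of local contributions. It uses plain NumPy and is not the authors' smeared-density, O(3)-equivariant LODE construction.

```python
# Minimal sketch of an atom-centered, long-range descriptor for point charges.
# Hypothetical illustration only: a bare 1/r potential feature, not the
# authors' published implementation.
import numpy as np

def potential_features(positions, charges):
    """Return V_i = sum_{j != i} q_j / |r_i - r_j| for every atomic center i.

    Each V_i is defined locally (at atom i) yet depends on all other atoms,
    so it carries long-range information with the correct 1/r asymptotics.
    """
    diff = positions[:, None, :] - positions[None, :, :]   # pairwise displacement vectors, shape (N, N, 3)
    dist = np.linalg.norm(diff, axis=-1)                    # pairwise distances, shape (N, N)
    np.fill_diagonal(dist, np.inf)                          # exclude self-interaction
    return (charges[None, :] / dist).sum(axis=1)            # V_i for each atom, shape (N,)

def electrostatic_energy(positions, charges):
    """Sum of atom-centered contributions: E = 1/2 * sum_i q_i * V_i."""
    V = potential_features(positions, charges)
    return 0.5 * np.dot(charges, V)

# Example: random point charges, loosely mirroring the paper's first benchmark setting.
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, size=(8, 3))
q = rng.choice([-1.0, 1.0], size=8)
print(electrostatic_energy(pos, q))
```

The point of the sketch is that the exact Coulomb energy already decomposes into a sum of atom-centered terms q_i V_i, so an additive model built on a locally defined descriptor with 1/r asymptotics can, in principle, recover long-range electrostatics that strictly short-range environment descriptors miss.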
