
Feature Selection for High-Dimensional Neural Network Potentials with the Adaptive Group Lasso

Published 26 Dec 2023 in cond-mat.dis-nn (arXiv:2312.15979v1)

Abstract: Neural network potentials are a powerful tool for atomistic simulations, making it possible to reproduce \textit{ab initio} potential energy surfaces accurately at a computational cost approaching that of classical force fields. A central component of such potentials is the transformation of atomic positions into a set of atomic features in an efficient and informative way. In this work, a feature selection method based on the Adaptive Group Lasso (AGL) approach is introduced for high-dimensional neural network potentials. It is shown that an embedded method, which accounts for the interplay between the features and their effect within the estimator, is necessary to optimize the number of features. The method's efficiency is tested on three monoatomic systems: Lennard-Jones as a simple test case, aluminium as a system characterized by predominantly radial interactions, and boron as representative of a system with strongly directional interactions. The AGL is compared with unsupervised filter methods and found to perform consistently better at reducing the number of features needed to reproduce the reference simulation data. In particular, our results show the importance of taking model predictions into account in feature selection for interatomic potentials.
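The abstract does not spell out the AGL objective, so the following is only a rough sketch of how an adaptive group lasso penalty is typically attached to a neural network potential; the notation is an assumption, not taken from the paper. The input-layer weights are grouped by atomic feature, and each group norm is penalized with a data-dependent weight obtained from an initial unpenalized fit:

\[
\min_{\theta}\;\sum_{i}\Big(E^{\mathrm{NN}}_{i}(\theta)-E^{\mathrm{ref}}_{i}\Big)^{2}
\;+\;\lambda\sum_{g=1}^{G}\hat{w}_{g}\,\lVert\theta_{g}\rVert_{2},
\qquad
\hat{w}_{g}=\lVert\tilde{\theta}_{g}\rVert_{2}^{-\gamma},
\]

where $\theta_{g}$ collects the first-layer weights attached to feature $g$, $\tilde{\theta}_{g}$ are the corresponding weights from an unregularized reference fit, and $\gamma>0$. Groups driven exactly to zero mark features that can be discarded, which is what makes such an approach an embedded, model-aware selector rather than an unsupervised filter.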

Citations (1)
