
SchNet - a deep learning architecture for molecules and materials

Published 17 Dec 2017 in physics.chem-ph and cond-mat.mtrl-sci | (1712.06113v3)

Abstract: Deep learning has led to a paradigm shift in artificial intelligence, including web, text and image search, speech recognition, as well as bioinformatics, with growing impact in chemical physics. Machine learning in general and deep learning in particular is ideally suited for representing quantum-mechanical interactions, enabling to model nonlinear potential-energy surfaces or enhancing the exploration of chemical compound space. Here we present the deep learning architecture SchNet that is specifically designed to model atomistic systems by making use of continuous-filter convolutional layers. We demonstrate the capabilities of SchNet by accurately predicting a range of properties across chemical space for \emph{molecules and materials} where our model learns chemically plausible embeddings of atom types across the periodic table. Finally, we employ SchNet to predict potential-energy surfaces and energy-conserving force fields for molecular dynamics simulations of small molecules and perform an exemplary study of the quantum-mechanical properties of C$_{20}$-fullerene that would have been infeasible with regular ab initio molecular dynamics.

Citations (1,532)

Summary

  • The paper introduces SchNet, a deep learning model that predicts quantum-mechanical interactions using continuous-filter convolutional layers.
  • The architecture learns atom embeddings directly from atomic positions, integrating symmetry invariances essential for accurate molecular simulations.
  • Numerical results demonstrate SchNet’s reduced prediction errors and improved performance over traditional neural network models in quantum chemistry.

Summary of "SchNet - a deep learning architecture for molecules and materials" (1712.06113)

The paper introduces SchNet, a deep learning model specifically designed for atomic-scale modeling of molecules and materials. The architecture incorporates continuous-filter convolutional layers, effectively allowing the prediction of quantum-mechanical interactions, potential-energy surfaces (PESs), and energy-conserving force fields for molecular dynamics simulations. Unlike traditional approaches requiring handcrafted descriptors, SchNet learns representations directly from atomic positions and types, integrating known chemical symmetries by design.

Architecture and Methodology

SchNet is an evolution of Deep Tensor Neural Networks (DTNN) with significant improvements in modeling atomic interactions via continuous-filter convolutional layers. These layers enable SchNet to compute atom-wise contributions efficiently while maintaining invariances such as rotational, translational, and permutational symmetry, which are essential for quantum chemistry applications. The architecture includes atom embeddings initialized based on atomic types and refined through interaction blocks.
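The embedding step can be sketched as a simple lookup by atomic number — a minimal illustration in which the feature dimension, the initialization scale, and the `embed` helper are assumptions, not the paper's exact configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# One learnable feature vector per atom type (indexed by nuclear charge Z),
# so atoms of the same element start from identical representations.
n_features = 64
embedding_table = rng.normal(scale=0.1, size=(100, n_features))

def embed(atomic_numbers):
    """Initial atom-wise representations: x_i^0 = a_{Z_i}."""
    return embedding_table[atomic_numbers]

# Example: a water molecule (O, H, H)
x0 = embed(np.array([8, 1, 1]))
print(x0.shape)  # → (3, 64)
```

Because the table is shared across all molecules, the learned rows end up encoding element identity — which is what allows the trained embeddings to recover periodic-table structure, as discussed below.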

The interaction layers utilize filter-generating networks, which model interatomic effects via learned continuous filters over interatomic distances. These networks incorporate known atomic symmetries and periodic boundary conditions to constrain and guide learning efficiently. The atom-wise layers within SchNet refine the per-atom features layer by layer, supporting accurate predictions of a wide range of material properties.
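A minimal NumPy sketch of the continuous-filter convolution described above — the layer sizes, the Gaussian radial-basis expansion, and all weights here are illustrative assumptions, not trained SchNet parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n_atoms, n_features, n_rbf = 3, 64, 20

def rbf_expand(d, centers, gamma=10.0):
    """Expand each interatomic distance in Gaussian radial basis functions."""
    return np.exp(-gamma * (d[..., None] - centers) ** 2)

def cfconv(x, positions, W1, b1, W2, b2, centers):
    """Continuous-filter convolution: x_i' = sum_j x_j * W(r_ij),
    with the filter W generated by a small network from RBF features."""
    diff = positions[:, None, :] - positions[None, :, :]
    d = np.linalg.norm(diff, axis=-1)                                # (n, n)
    filters = np.tanh(rbf_expand(d, centers) @ W1 + b1) @ W2 + b2   # (n, n, F)
    mask = 1.0 - np.eye(len(x))        # exclude self-interaction
    return (filters * x[None, :, :] * mask[..., None]).sum(axis=1)

# Toy parameters and a water-like geometry (random weights: a sketch only).
centers = np.linspace(0.0, 5.0, n_rbf)
W1 = rng.normal(size=(n_rbf, 32)); b1 = np.zeros(32)
W2 = rng.normal(size=(32, n_features)); b2 = np.zeros(n_features)
x = rng.normal(size=(n_atoms, n_features))
pos = np.array([[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]])

out = cfconv(x, pos, W1, b1, W2, b2, centers)
print(out.shape)  # → (3, 64)
```

Because the generated filters depend only on interatomic distances, the output is invariant to rotations and translations of the input geometry, and the sum over neighbors makes it invariant to atom permutations — the symmetries noted above.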

Numerical Results and Analysis

SchNet demonstrates its capability across several benchmark datasets. For QM9, a dataset of small organic molecules, SchNet yields low mean absolute errors, outperforming other neural network models in predicting molecular properties such as HOMO and LUMO energies and polarizabilities. In the Materials Project dataset, SchNet accurately predicts formation energies across 89 atom types, showcasing scalability and adaptability to complex systems.

A particularly novel aspect of SchNet is its ability to learn atom-type embeddings that reflect the periodic table structure, implying that the network recovers fundamental chemical groupings from data alone. Additionally, the representation provides local chemical potentials that enable deeper insights into atomic environments and potential reactivity.

Molecular Dynamics Application

The paper applies SchNet to predict PESs for molecular dynamics simulations, particularly focusing on the MD17 dataset and a C$_{20}$-fullerene study. When trained with energy and atomic forces, SchNet shows significant improvements in both small molecule trajectories and large molecular systems. The combination of speed and accuracy presents a viable alternative to traditional ab initio methods, reducing simulation times drastically while maintaining high fidelity to quantum mechanical properties.
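The idea of training on energies and forces together can be illustrated with a toy model — here a stand-in pair potential replaces the SchNet energy network, forces come from finite differences rather than the automatic differentiation used in practice, and the trade-off weight `rho` is an assumed value:

```python
import numpy as np

def toy_energy(positions):
    """Stand-in for a learned energy prediction E(r_1..r_n): a simple
    pair potential with equilibrium distance 1.0 (illustrative only)."""
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    iu = np.triu_indices(len(positions), k=1)
    return np.sum((d[iu] - 1.0) ** 2)

def forces(energy_fn, positions, eps=1e-5):
    """Energy-conserving forces F_i = -dE/dr_i, here via central
    differences (SchNet uses automatic differentiation instead)."""
    F = np.zeros_like(positions)
    for i in range(positions.shape[0]):
        for k in range(3):
            p = positions.copy()
            p[i, k] += eps; e_plus = energy_fn(p)
            p[i, k] -= 2 * eps; e_minus = energy_fn(p)
            F[i, k] = -(e_plus - e_minus) / (2 * eps)
    return F

def combined_loss(E_pred, E_ref, F_pred, F_ref, rho=0.01):
    """Weighted sum of energy and force errors for force-field training."""
    return rho * (E_pred - E_ref) ** 2 + np.mean((F_pred - F_ref) ** 2)

pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [0.0, 1.2, 0.0]])
E = toy_energy(pos)
F = forces(toy_energy, pos)
loss = combined_loss(E, 0.1, F, np.zeros_like(F))
```

Deriving forces as the negative gradient of a single energy model is what makes the resulting force field energy-conserving by construction; here the forces of the pair potential sum to zero, consistent with Newton's third law.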

For C$_{20}$-fullerene, SchNet supports path-integral molecular dynamics (PIMD) simulations, capturing nuclear quantum effects efficiently. This represents an innovative use of machine learning to enable extensive simulations previously infeasible due to computational cost, making nanosecond-scale molecular dynamics trajectories accessible.

Implications and Future Directions

SchNet contributes to the field of computational chemistry by providing a flexible yet rigorous tool for simulating and understanding quantum-chemical interactions in large, complex systems. The architecture's built-in symmetries and capacity for learning directly from data exemplify the shift from descriptor-based to data-driven modeling approaches. The potential for integrating further chemical knowledge through filter-generating networks opens avenues for improved accuracy and interpretability in simulations.

Going forward, SchNet could be extended to larger periodic systems and refined with interpretable output layers specific to certain chemical properties. Its effectiveness in combining energies and forces could lead to advancements in studying vibrational modes and mechanical properties at high precision.

In summary, SchNet represents a significant step in applying deep learning to quantum chemistry, offering enhanced performance for property predictions and molecular dynamics simulations, and setting the stage for further advancements in machine learning applications in material science.
