
Cormorant: Covariant Molecular Neural Networks (1906.04015v3)

Published 6 Jun 2019 in physics.comp-ph, cs.LG, and stat.ML

Abstract: We propose Cormorant, a rotationally covariant neural network architecture for learning the behavior and properties of complex many-body physical systems. We apply these networks to molecular systems with two goals: learning atomic potential energy surfaces for use in Molecular Dynamics simulations, and learning ground state properties of molecules calculated by Density Functional Theory. Some of the key features of our network are that (a) each neuron explicitly corresponds to a subset of atoms; (b) the activation of each neuron is covariant to rotations, ensuring that overall the network is fully rotationally invariant. Furthermore, the non-linearity in our network is based upon tensor products and the Clebsch-Gordan decomposition, allowing the network to operate entirely in Fourier space. Cormorant significantly outperforms competing algorithms in learning molecular Potential Energy Surfaces from conformational geometries in the MD-17 dataset, and is competitive with other methods at learning geometric, energetic, electronic, and thermodynamic properties of molecules on the GDB-9 dataset.

Citations (398)

Summary

  • The paper introduces Cormorant, a neural network that employs spherical tensors and Clebsch–Gordan products to enforce rotational invariance in molecular modeling.
  • It significantly outperforms competing models such as DeepMD and SchNet at predicting molecular potential energy surfaces on MD-17, and is competitive with state-of-the-art methods at predicting electronic and thermodynamic properties on QM-9.
  • Results suggest that integrating physical symmetries into neural network design enhances generalization and reduces the need for manual feature engineering in complex molecular systems.

Covariant Molecular Neural Networks: An Examination of Cormorant

In this paper, the authors introduce Cormorant, an advanced neural network architecture designed explicitly for molecular systems. The architecture is engineered to capture covariant behavior under rotational transformations, thereby addressing a long-standing challenge in molecular modeling: the need to maintain rotational invariance while learning molecular properties and dynamics.

Key Features and Methodology

Cormorant is built on a formulation in which each neuron's activation is a set of spherical tensors, mathematical objects that transform predictably under rotation, rather than traditional scalar or vector activations. Because the activations are covariant, the network's final output is invariant to any global rotation applied to the input geometry. The use of spherical tensors draws on the representation theory of the three-dimensional rotation group $\mathrm{SO}(3)$, whose irreducible representations are given by the Wigner D-matrices.
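To make the covariance requirement concrete, here is a minimal sketch (assuming only NumPy and SciPy, and not taken from the Cormorant codebase) that checks $f(Rx) = D(R)\,f(x)$ for a toy $\ell = 1$ feature; for $\ell = 1$ the Wigner D-matrix in the Cartesian basis is simply the $3 \times 3$ rotation matrix $R$.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def l1_feature(positions):
    """Toy rotation-covariant l=1 (vector) feature: the sum of relative position
    vectors weighted by a Gaussian of their (rotation-invariant) lengths."""
    rel = positions - positions.mean(axis=0)
    weights = np.exp(-np.linalg.norm(rel, axis=1, keepdims=True) ** 2)
    return (weights * rel).sum(axis=0)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))                          # toy "molecule": 5 atoms in 3D
R = Rotation.from_euler("zyx", [0.3, -1.2, 0.7]).as_matrix()

# Covariance check: the feature of the rotated molecule equals the rotated feature.
assert np.allclose(l1_feature(x @ R.T), R @ l1_feature(x))
```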

Moreover, Cormorant utilizes Clebsch-Gordan products as a novel form of non-linearity within the network. Because the Clebsch-Gordan decomposition combines spherical tensors directly in Fourier space, the network never has to leave that representation. This allows the model to learn complex couplings, analogous to physical dipole-dipole and quadrupole-quadrupole interactions, without explicit handcrafted feature engineering, relying instead on the network's capacity to approximate such functions through learned representations.
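To illustrate what a Clebsch-Gordan product does, the sketch below couples two $\ell = 1$ spherical tensors into output tensors of degree 0, 1, and 2 using SymPy's exact Clebsch-Gordan coefficients. This is a pedagogical example with made-up inputs, not the paper's (far more efficient) implementation.

```python
import numpy as np
from sympy.physics.quantum.cg import CG  # exact Clebsch-Gordan coefficients

def cg_product(f1, l1, f2, l2, l_out):
    """Couple spherical tensors f1 (degree l1) and f2 (degree l2) into a
    degree-l_out tensor. Components are indexed by m = -l, ..., +l."""
    out = np.zeros(2 * l_out + 1, dtype=complex)
    for i1, m1 in enumerate(range(-l1, l1 + 1)):
        for i2, m2 in enumerate(range(-l2, l2 + 1)):
            m = m1 + m2
            if abs(m) <= l_out:
                coeff = float(CG(l1, m1, l2, m2, l_out, m).doit())
                out[m + l_out] += coeff * f1[i1] * f2[i2]
    return out

rng = np.random.default_rng(1)
f1 = rng.normal(size=3) + 1j * rng.normal(size=3)  # a degree-1 tensor (3 components)
f2 = rng.normal(size=3) + 1j * rng.normal(size=3)
for l_out in (0, 1, 2):                            # 1 x 1 decomposes into 0 + 1 + 2
    print(l_out, cg_product(f1, 1, f2, 1, l_out))
```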

In terms of network structure, Cormorant aggregates two-body interactions and one-body interactions derived from molecular configurations, facilitating the extraction of physically interpretable features. The unique setup—where a neuron corresponds either to an individual atom or a meaningful subset of atoms—encourages the network to learn interactions akin to physical laws.
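A schematic of this aggregation pattern, reduced to plain scalar features and a generic non-linearity, is sketched below. In Cormorant itself each per-atom feature is a collection of spherical tensors and the combination step uses the Clebsch-Gordan product; the function and parameter names here are illustrative only.

```python
import numpy as np

def update_atom_features(h, positions, cutoff=5.0):
    """One toy message-passing step: each atom aggregates a radially weighted sum
    of its neighbours' features (two-body term) and combines it with its own
    feature (one-body term) through a simple non-linearity."""
    n = len(positions)
    dists = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    radial = np.exp(-dists**2) * (dists < cutoff) * (1.0 - np.eye(n))  # two-body weights
    two_body = radial @ h               # per-atom aggregation over neighbours
    return np.tanh(h + two_body)        # stand-in for Cormorant's CG non-linearity

rng = np.random.default_rng(2)
positions = rng.normal(size=(4, 3))     # 4 atoms in 3D
h = rng.normal(size=(4, 8))             # 8 scalar features per atom
print(update_atom_features(h, positions).shape)  # (4, 8)
```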

Numerical Results and Comparative Analysis

In evaluating Cormorant's performance, the authors applied it to two primary datasets: MD-17 for molecular dynamics and QM-9 for ground state property prediction. In the MD-17 task, Cormorant achieved superior performance compared to existing models, such as DeepMD and SchNet, on the prediction of molecular potential energy surfaces. This result underscores the significant promise of using neural networks with inherent physical symmetry considerations.

On the QM-9 dataset, Cormorant demonstrated competitive results, outperforming or matching state-of-the-art approaches on several property predictions, including the dipole moment ($\mu$) and the electronic energy gap ($\Delta \epsilon$). Where it did not lead, it still performed within a reasonable range of state-of-the-art models such as SchNet and MPNNs, indicating robust generalization across different molecular properties.

Implications and Future Directions

Practically, the findings suggest that neural network architectures capable of learning with an intrinsic awareness of physical symmetries can offer improvements over traditional machine learning approaches in molecular science. The ability of Cormorant to generalize from the nature of interactions driven by quantum mechanical symmetry, without exhaustive feature engineering, broadens its applicability across different molecular datasets and potentially more complex systems.

Theoretically, the successful implementation of Clebsch-Gordan-based non-linearities paves the way for future research on neural networks over compact symmetry groups. This line of inquiry may extend to other types of gauge symmetries, possibly refining models in fields extending beyond molecular dynamics.

Future work may explore the integration of Cormorant-trained potentials into molecular dynamics simulations, enhancing simulation accuracy and efficiency. Additionally, extending this approach to other symmetry groups, beyond $\mathrm{SO}(3)$, may allow for new applications in materials science or other domains where rotational or translational invariance underpins system behavior. The critical next steps involve further testing and refinement to ensure that these neural network models can predict properties for larger and more complex systems with consistent accuracy.
