Scalable Geometric Deep Learning on Molecular Graphs (2112.03364v1)

Published 6 Dec 2021 in cs.LG, cond-mat.mtrl-sci, and physics.chem-ph

Abstract: Deep learning in molecular and materials sciences is limited by the lack of integration between applied science, artificial intelligence, and high-performance computing. Bottlenecks with respect to the amount of training data, the size and complexity of model architectures, and the scale of the compute infrastructure are all key factors limiting the scaling of deep learning for molecules and materials. Here, we present $\textit{LitMatter}$, a lightweight framework for scaling molecular deep learning methods. We train four graph neural network architectures on over 400 GPUs and investigate the scaling behavior of these methods. Depending on the model architecture, training time speedups up to $60\times$ are seen. Empirical neural scaling relations quantify the model-dependent scaling and enable optimal compute resource allocation and the identification of scalable molecular geometric deep learning model implementations.
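The two technical ingredients described in the abstract are multi-GPU data-parallel training of graph neural networks and fitting empirical neural scaling relations to the measured training times. The sketch below is illustrative only: it assumes a PyTorch Lightning-style Trainer API (the framework LitMatter builds on), and the model, data module, GPU counts, and timings are hypothetical placeholders rather than the paper's actual code or measurements.

```python
# Illustrative sketch (not the authors' LitMatter code): (1) launch distributed
# data-parallel training across several GPUs, and (2) fit an empirical scaling
# relation t(n) = a * n^b for training time versus GPU count.

import numpy as np
import pytorch_lightning as pl


def train_on_n_gpus(model: pl.LightningModule,
                    datamodule: pl.LightningDataModule,
                    num_gpus: int,
                    max_epochs: int = 10) -> None:
    """Train the model with distributed data parallelism on `num_gpus` devices."""
    trainer = pl.Trainer(
        accelerator="gpu",
        devices=num_gpus,
        strategy="ddp",        # distributed data parallel across GPUs/nodes
        max_epochs=max_epochs,
    )
    trainer.fit(model, datamodule=datamodule)


def fit_scaling_relation(gpu_counts, train_times):
    """Fit t(n) = a * n^b by linear regression in log-log space.

    An exponent b near -1 indicates near-ideal scaling; b near 0 means adding
    GPUs no longer reduces training time for this architecture.
    """
    log_n = np.log(np.asarray(gpu_counts, dtype=float))
    log_t = np.log(np.asarray(train_times, dtype=float))
    b, log_a = np.polyfit(log_n, log_t, deg=1)
    return np.exp(log_a), b


# Hypothetical timings (hours) at increasing GPU counts, for illustration only.
a, b = fit_scaling_relation([1, 8, 64, 256, 400], [40.0, 6.1, 1.2, 0.8, 0.7])
print(f"t(n) ~ {a:.1f} * n^{b:.2f}")
```

Fitting the exponent per architecture is what lets one compare how well different GNN implementations scale and decide how many GPUs are worth allocating to each.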

Authors (6)
  1. Nathan C. Frey
  2. Siddharth Samsi
  3. Joseph McDonald
  4. Lin Li
  5. Connor W. Coley
  6. Vijay Gadepally
Citations (4)
