
May the Force be with You: Unified Force-Centric Pre-Training for 3D Molecular Conformations (2308.14759v1)

Published 24 Aug 2023 in physics.chem-ph, cs.AI, cs.LG, and q-bio.BM

Abstract: Recent works have shown the promise of learning pre-trained models for 3D molecular representation. However, existing pre-training models focus predominantly on equilibrium data and largely overlook off-equilibrium conformations. It is challenging to extend these methods to off-equilibrium data because their training objective relies on the assumption that conformations are local energy minima. We address this gap by proposing a force-centric pre-training model for 3D molecular conformations covering both equilibrium and off-equilibrium data. For off-equilibrium data, our model learns directly from their atomic forces. For equilibrium data, we introduce zero-force regularization and force-based denoising techniques to approximate near-equilibrium forces. We obtain a unified pre-trained model for 3D molecular representation with over 15 million diverse conformations. Experiments show that, with our pre-training objective, we improve force accuracy by around 3 times compared to the un-pre-trained Equivariant Transformer model. By incorporating regularizations on equilibrium data, we solve the problem of unstable MD simulations in vanilla Equivariant Transformers, achieving state-of-the-art simulation performance with 2.45 times faster inference than NequIP. As a powerful molecular encoder, our pre-trained model achieves performance on par with state-of-the-art methods on property prediction tasks.

Authors (7)
  1. Rui Feng
  2. Qi Zhu
  3. Huan Tran
  4. Binghong Chen
  5. Aubrey Toland
  6. Rampi Ramprasad
  7. Chao Zhang
Citations (8)
