Tailoring Molecules for Protein Pockets: a Transformer-based Generative Solution for Structured-based Drug Design (2209.06158v1)

Published 30 Aug 2022 in q-bio.BM and cs.LG

Abstract: Structure-based drug design is drawing growing attention in computer-aided drug discovery. Compared with the virtual screening approach, where a pre-defined library of compounds is computationally screened, de novo drug design based on the structure of a target protein can provide novel drug candidates. In this paper, we present a generative solution named TamGent (Target-aware molecule generator with Transformer) that can directly generate candidate drugs from scratch for a given target, overcoming the limits imposed by existing compound libraries. Following the Transformer framework (a state-of-the-art framework in deep learning), we design a variant of the Transformer encoder to process 3D geometric information of targets and pre-train the Transformer decoder on 10 million compounds from PubChem for candidate drug generation. Systematic evaluation of candidate compounds generated for targets from DrugBank shows that both binding affinity and druggability are largely improved. TamGent outperforms previous baselines in terms of both effectiveness and efficiency. The method is further verified by generating candidate compounds for the SARS-CoV-2 main protease and the oncogenic mutant KRAS G12C. The results show that our method not only re-discovers previously verified drug molecules, but also generates novel molecules with better docking scores, expanding the compound pool and potentially leading to the discovery of novel drugs.
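Below is a minimal sketch, not the authors' implementation, of how a target-aware Transformer generator in the spirit of TamGent could be wired up in PyTorch: an encoder consumes pocket atoms (atom types plus 3D coordinates) and a decoder autoregressively generates SMILES tokens while cross-attending to the encoded pocket. All dimensions, vocabularies, and the coordinate-injection scheme are illustrative assumptions rather than the paper's exact design.

```python
# Illustrative sketch only (assumed architecture, not the TamGent code):
# pocket encoder over atom types + xyz coordinates, SMILES decoder on top.
import torch
import torch.nn as nn


class PocketEncoder(nn.Module):
    """Encodes pocket atoms; 3D coordinates are linearly projected and added
    to atom-type embeddings (a stand-in for the paper's geometric encoding)."""

    def __init__(self, n_atom_types=32, d_model=256, n_layers=4, n_heads=8):
        super().__init__()
        self.atom_emb = nn.Embedding(n_atom_types, d_model)
        self.coord_proj = nn.Linear(3, d_model)  # inject xyz coordinates
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, atom_types, coords):
        # atom_types: (B, N) long, coords: (B, N, 3) float
        x = self.atom_emb(atom_types) + self.coord_proj(coords)
        return self.encoder(x)  # (B, N, d_model) pocket memory


class SmilesDecoder(nn.Module):
    """Autoregressive Transformer decoder over SMILES tokens that
    cross-attends to the encoded pocket memory."""

    def __init__(self, vocab_size=128, d_model=256, n_layers=4, n_heads=8,
                 max_len=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens, memory):
        # tokens: (B, T) long SMILES token ids, memory: (B, N, d_model)
        T = tokens.size(1)
        pos = torch.arange(T, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        causal = nn.Transformer.generate_square_subsequent_mask(T).to(tokens.device)
        h = self.decoder(x, memory, tgt_mask=causal)
        return self.lm_head(h)  # (B, T, vocab) next-token logits


if __name__ == "__main__":
    enc, dec = PocketEncoder(), SmilesDecoder()
    atom_types = torch.randint(0, 32, (2, 50))      # toy pocket: 50 atoms
    coords = torch.randn(2, 50, 3)                  # toy 3D coordinates
    smiles_tokens = torch.randint(0, 128, (2, 20))  # toy SMILES prefix
    logits = dec(smiles_tokens, enc(atom_types, coords))
    print(logits.shape)  # torch.Size([2, 20, 128])
```

In this sketch the decoder could be pre-trained on unconditioned compounds (as the paper does with PubChem) before fine-tuning with pocket conditioning; the hypothetical `PocketEncoder` and `SmilesDecoder` names and the additive coordinate projection are assumptions made for illustration.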

Authors (10)
  1. Kehan Wu (6 papers)
  2. Yingce Xia (53 papers)
  3. Yang Fan (27 papers)
  4. Pan Deng (11 papers)
  5. Haiguang Liu (13 papers)
  6. Lijun Wu (113 papers)
  7. Shufang Xie (29 papers)
  8. Tong Wang (144 papers)
  9. Tao Qin (201 papers)
  10. Tie-Yan Liu (242 papers)
Citations (2)