
Improved Drug-target Interaction Prediction with Intermolecular Graph Transformer (2110.07347v2)

Published 14 Oct 2021 in cs.LG and q-bio.QM

Abstract: The identification of active binding drugs for target proteins (termed drug-target interaction prediction) is the key challenge in virtual screening, which plays an essential role in drug discovery. Although recent deep learning-based approaches have achieved better performance than molecular docking, existing models often neglect certain aspects of the intermolecular information, hindering prediction performance. We recognize this problem and propose a novel approach named Intermolecular Graph Transformer (IGT) that employs a dedicated attention mechanism to model intermolecular information with a three-way Transformer-based architecture. IGT outperforms state-of-the-art approaches by 9.1% and 20.5% over the second best for binding activity and binding pose prediction, respectively, and shows superior generalization ability to unseen receptor proteins. Furthermore, IGT exhibits promising drug screening ability against SARS-CoV-2, identifying 83.1% of active drugs that have been validated by wet-lab experiments with near-native predicted binding poses.
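The abstract names a three-way Transformer-based architecture with a dedicated attention mechanism for intermolecular information. The paper's actual layer definitions are not reproduced here, so the following is only a minimal NumPy sketch of the general idea under stated assumptions: intramolecular self-attention is applied separately to ligand and protein node features, followed by an intermolecular cross-attention from ligand to protein. All dimensions, the update order, and the absence of edge features are illustrative assumptions, not IGT's actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention: softmax(QK^T / sqrt(d)) V
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d = 8                                  # hypothetical feature dimension
ligand = rng.standard_normal((5, d))   # 5 ligand atoms (toy data)
protein = rng.standard_normal((12, d)) # 12 protein atoms (toy data)

# Three attention pathways (illustrative, not IGT's exact layers):
ligand_intra = attention(ligand, ligand, ligand)     # ligand graph
protein_intra = attention(protein, protein, protein) # protein graph
# intermolecular: ligand nodes attend over protein nodes
ligand_inter = attention(ligand_intra, protein_intra, protein_intra)

print(ligand_inter.shape)
```

A real implementation would add learned projection matrices, multi-head splitting, edge/distance biases from the intermolecular graph, and residual/normalization layers around each attention step.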

Authors (9)
  1. Siyuan Liu (68 papers)
  2. Yusong Wang (20 papers)
  3. Tong Wang (144 papers)
  4. Yifan Deng (11 papers)
  5. Liang He (202 papers)
  6. Bin Shao (61 papers)
  7. Jian Yin (67 papers)
  8. Nanning Zheng (146 papers)
  9. Tie-Yan Liu (242 papers)
Citations (12)
