ETDock: A Novel Equivariant Transformer for Protein-Ligand Docking (2310.08061v1)

Published 12 Oct 2023 in q-bio.BM and cs.LG

Abstract: Predicting the docking between proteins and ligands is a crucial and challenging task for drug discovery. However, traditional docking methods mainly rely on scoring functions, and deep learning-based docking approaches usually neglect the 3D spatial information of proteins and ligands, as well as the graph-level features of ligands, which limits their performance. To address these limitations, we propose an equivariant transformer neural network for protein-ligand docking pose prediction. Our approach involves the fusion of ligand graph-level features by feature processing, followed by the learning of ligand and protein representations using our proposed TAMformer module. Additionally, we employ an iterative optimization approach based on the predicted distance matrix to generate refined ligand poses. The experimental results on real datasets show that our model can achieve state-of-the-art performance.
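The abstract's final step, iteratively refining a ligand pose from a predicted inter-atomic distance matrix, can be illustrated with a minimal gradient-descent sketch. This is not the authors' actual optimization procedure (the paper's TAMformer and its refinement scheme are not reproduced here); `refine_pose` and all parameters below are hypothetical, showing only the generic idea of fitting 3D coordinates to a target distance matrix:

```python
import numpy as np

def refine_pose(coords, target_dist, lr=0.01, n_iters=500):
    """Iteratively adjust ligand atom coordinates so their pairwise
    distances approach a predicted target distance matrix.

    coords:      (n, 3) initial atom positions
    target_dist: (n, n) predicted inter-atomic distances
    """
    coords = coords.copy()
    for _ in range(n_iters):
        diff = coords[:, None, :] - coords[None, :, :]   # (n, n, 3) displacement vectors
        dist = np.linalg.norm(diff, axis=-1)             # (n, n) current distances
        np.fill_diagonal(dist, 1.0)                      # avoid division by zero on the diagonal
        err = dist - target_dist
        np.fill_diagonal(err, 0.0)                       # self-distances carry no signal
        # Gradient of sum over pairs of (dist - target)^2 w.r.t. each coordinate
        grad = ((err / dist)[:, :, None] * diff).sum(axis=1)
        coords -= lr * grad
    return coords
```

A perturbed pose optimized this way moves back toward any geometry consistent with the target distances; a learned distance matrix, as in the paper, would play the role of `target_dist`.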

Authors (5)
  1. Yiqiang Yi
  2. Xu Wan
  3. Yatao Bian
  4. Le Ou-Yang
  5. Peilin Zhao
Citations (1)
