
A Transformer-based Generative Model for De Novo Molecular Design (2210.08749v2)

Published 17 Oct 2022 in cs.LG and q-bio.BM

Abstract: In drug discovery, molecular design aims to identify novel compounds from a chemical space in which the number of potential drug-like molecules is estimated to be on the order of 10^60 - 10^100. Since this search task is computationally intractable due to the unbounded search space, deep learning has drawn much attention as a new way of generating unseen molecules. As we seek compounds for specific target proteins, we propose a Transformer-based deep model for de novo target-specific molecular design. The proposed method is capable of generating both drug-like compounds (without specified targets) and target-specific compounds. The latter are generated by enforcing different keys and values of the multi-head attention for each target; in this way, the generation of SMILES strings is conditioned on the specified target. Experimental results demonstrate that our method generates both valid drug-like compounds and target-specific compounds. Moreover, compounds sampled from the conditional model largely occupy the chemical space of real target-specific molecules and also cover a significant fraction of novel compounds.
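The conditioning mechanism described in the abstract, supplying target-specific keys and values to the attention so that SMILES decoding depends on the chosen target, can be sketched in a simplified single-head form. This is a hypothetical illustration under assumed shapes, not the paper's implementation: `conditional_attention`, the random target matrices, and the dimensions are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def conditional_attention(queries, target_kv):
    """Single-head scaled dot-product attention where the keys and values
    come from a target-specific representation rather than the decoded
    sequence (a simplified stand-in for the paper's conditional
    multi-head attention)."""
    K, V = target_kv                       # each (n_ctx, d)
    d = K.shape[-1]
    scores = queries @ K.T / np.sqrt(d)    # (n_q, n_ctx)
    weights = softmax(scores, axis=-1)     # attention over the target context
    return weights @ V                     # (n_q, d)

rng = np.random.default_rng(0)
d = 8
queries = rng.standard_normal((5, d))      # decoder states for 5 SMILES tokens
# Hypothetical per-target key/value matrices for two different targets.
target_A = (rng.standard_normal((3, d)), rng.standard_normal((3, d)))
target_B = (rng.standard_normal((3, d)), rng.standard_normal((3, d)))

out_A = conditional_attention(queries, target_A)
out_B = conditional_attention(queries, target_B)
# The same queries attend to different keys/values, so the decoder
# receives different conditioning signals per target.
assert out_A.shape == (5, d) and not np.allclose(out_A, out_B)
```

Swapping the key/value matrices per target is what makes the generation conditional: the decoder's queries are unchanged, but the context they attend over differs for each target protein.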

Authors (4)
  1. Wenlu Wang (14 papers)
  2. Ye Wang (248 papers)
  3. Honggang Zhao (2 papers)
  4. Simone Sciabola (3 papers)
