
Probabilistic Generative Transformer Language models for Generative Design of Molecules (2209.09406v1)

Published 20 Sep 2022 in cond-mat.mtrl-sci, cs.LG, and physics.chem-ph

Abstract: Self-supervised neural language models have recently found wide application in the generative design of organic molecules and protein sequences, as well as in representation learning for downstream structure classification and functional prediction. However, most existing deep learning models for molecule design require large datasets and have black-box architectures, which makes their design logic difficult to interpret. Here we propose the Generative Molecular Transformer (GMTransformer), a probabilistic neural network model for the generative design of molecules. Our model is built on the blank-filling language model originally developed for text processing, which has demonstrated unique advantages in learning "molecule grammars" with high-quality generation, interpretability, and data efficiency. Benchmarked on the MOSES datasets, our models achieve high novelty and Scaf scores compared to other baselines. The probabilistic generation steps have potential for tinkering-style molecule design, since they can recommend, with explanation, how to modify existing molecules, guided by the learned implicit molecule chemistry. The source code and datasets can be accessed freely at https://github.com/usccolumbia/GMTransformer
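The blank-filling generation step described above can be illustrated with a minimal sketch: mask one token of a SMILES string and rank candidate fills by a probability estimated from the surrounding context. The "model" here is a toy neighbor-count table built from a few hand-picked SMILES strings, not the authors' transformer; the training strings, `suggest_fills` function, and context scheme are all assumptions made for illustration.

```python
from collections import defaultdict

# Toy "training set" of SMILES strings (illustrative only; GMTransformer
# is trained on the much larger MOSES dataset).
train = ["CCO", "CCN", "CCC", "CNC", "COC", "CNO"]

# Count how often each token appears between a given (left, right) neighbor
# pair -- a crude stand-in for the transformer's learned fill distribution.
counts = defaultdict(lambda: defaultdict(int))
for s in train:
    tokens = ["<s>"] + list(s) + ["</s>"]
    for i in range(1, len(tokens) - 1):
        counts[(tokens[i - 1], tokens[i + 1])][tokens[i]] += 1

def suggest_fills(smiles, pos):
    """Rank candidate tokens for the blank at index `pos`, with probabilities.

    This mirrors the "tinkering" use case: the model explains a proposed
    edit by exposing the probability of each candidate fill.
    """
    tokens = ["<s>"] + list(smiles) + ["</s>"]
    ctx = (tokens[pos], tokens[pos + 2])  # filled neighbors of the blank
    cand = counts[ctx]
    total = sum(cand.values())
    return sorted(((t, c / total) for t, c in cand.items()),
                  key=lambda x: -x[1])

# Mask the middle atom of a 3-atom chain: context is ('C', 'O').
print(suggest_fills("C_O", 1))  # e.g. [('C', 0.5), ('N', 0.5)]
```

A real blank-filling model conditions on the full sequence and fills blanks iteratively, but the principle is the same: each fill comes with an explicit probability, which is what makes the generation steps interpretable.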

Authors (5)
  1. Lai Wei (68 papers)
  2. Nihang Fu (20 papers)
  3. Yuqi Song (21 papers)
  4. Qian Wang (453 papers)
  5. Jianjun Hu (55 papers)
Citations (10)
