Structure Language Models for Protein Conformation Generation (2410.18403v2)
Abstract: Proteins adopt multiple structural conformations to perform their diverse biological functions, and understanding these conformations is crucial for advancing drug discovery. Traditional physics-based simulation methods often struggle to sample equilibrium conformations and are computationally expensive. Recently, deep generative models have shown promise as a more efficient alternative for generating protein conformations. However, these methods predominantly rely on a diffusion process in 3D geometric space, which tends to concentrate samples near metastable states and is often inefficient in terms of runtime. In this paper, we introduce Structure Language Modeling (SLM) as a novel framework for efficient protein conformation generation. Specifically, protein structures are first encoded into a compact latent space using a discrete variational auto-encoder, followed by conditional language modeling that effectively captures sequence-specific conformation distributions. This enables more efficient and interpretable exploration of diverse ensemble modes than existing methods. Based on this general framework, we instantiate SLM with several popular language-model architectures and also propose ESMDiff, a novel BERT-like structure language model fine-tuned from ESM3 with masked diffusion. We verify our approach in various scenarios, including the equilibrium dynamics of BPTI, conformational change pairs, and intrinsically disordered proteins. SLM provides a highly efficient solution, offering a 20-100x speedup over existing methods in generating diverse conformations, shedding light on promising avenues for future research.
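The decoding procedure described in the abstract can be illustrated with a minimal sketch: structures are assumed to already be tokenized into a discrete codebook by the VAE, and a BERT-style masked-diffusion decoder then starts from a fully masked token sequence and iteratively unmasks positions conditioned on the amino-acid sequence. Everything here is hypothetical scaffolding (the codebook size, `toy_conditional_model`, the reveal schedule); the real SLM/ESMDiff models are learned neural networks, not shown in the paper excerpt above.

```python
import random

MASK = -1
VOCAB_SIZE = 16   # hypothetical size of the discrete structure-token codebook
NUM_STEPS = 4     # number of iterative unmasking rounds (illustrative)

def toy_conditional_model(seq, tokens):
    """Stand-in for the learned conditional structure language model.

    Returns a proposed structure token for every position, conditioned on
    the amino-acid sequence. A real SLM would be a trained network; this
    toy version just samples deterministically from the sequence hash.
    """
    rng = random.Random(sum(ord(c) for c in seq))
    return [rng.randrange(VOCAB_SIZE) for _ in tokens]

def masked_diffusion_sample(seq, num_steps=NUM_STEPS, seed=0):
    """BERT-style iterative unmasking: begin fully masked, then reveal a
    growing fraction of positions at each step until none remain."""
    rng = random.Random(seed)
    n = len(seq)
    tokens = [MASK] * n
    order = list(range(n))
    rng.shuffle(order)  # hypothetical random reveal order
    for step in range(num_steps):
        proposals = toy_conditional_model(seq, tokens)
        # number of positions that should be unmasked after this step
        k = (step + 1) * n // num_steps
        for pos in order[:k]:
            if tokens[pos] == MASK:
                tokens[pos] = proposals[pos]
    return tokens

# One sampled "conformation" as a list of discrete structure tokens:
conformation = masked_diffusion_sample("MKTAYIAKQR")
print(conformation)
```

Because sampling operates on short discrete token sequences rather than on 3D coordinates, many conformations can be drawn cheaply; a separate VAE decoder (not sketched here) would map tokens back to structures.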