Protein Design with Guided Discrete Diffusion (2305.20009v2)

Published 31 May 2023 in cs.LG and q-bio.BM

Abstract: A popular approach to protein design is to combine a generative model with a discriminative model for conditional sampling. The generative model samples plausible sequences while the discriminative model guides a search for sequences with high fitness. Given its broad success in conditional sampling, classifier-guided diffusion modeling is a promising foundation for protein design, leading many to develop guided diffusion models for structure with inverse folding to recover sequences. In this work, we propose diffusioN Optimized Sampling (NOS), a guidance method for discrete diffusion models that follows gradients in the hidden states of the denoising network. NOS makes it possible to perform design directly in sequence space, circumventing significant limitations of structure-based methods, including scarce data and challenging inverse design. Moreover, we use NOS to generalize LaMBO, a Bayesian optimization procedure for sequence design that facilitates multiple objectives and edit-based constraints. The resulting method, LaMBO-2, enables discrete diffusions and stronger performance with limited edits through a novel application of saliency maps. We apply LaMBO-2 to a real-world protein design task, optimizing antibodies for higher expression yield and binding affinity to several therapeutic targets under locality and developability constraints, attaining a 99% expression rate and 40% binding rate in exploratory in vitro experiments.

Summary

  • The paper introduces diffusioN Optimized Sampling (NOS), a guidance method for discrete diffusion models that enables protein sequence optimization directly in sequence space.
  • The paper extends LaMBO to LaMBO-2, integrating saliency maps to pinpoint the most promising sequence edits for multi-objective design.
  • The paper reports a 99% expression rate and a 40% binding rate in exploratory in vitro antibody optimization experiments.

An Analysis of "Protein Design with Guided Discrete Diffusion"

The paper "Protein Design with Guided Discrete Diffusion" explores a novel approach to protein design by leveraging discrete diffusion models. The authors propose a framework called diffusioN Optimized Sampling (NOS) to guide sampling in discrete diffusion models directly in sequence space. This approach contrasts with structure-based protein design methods that often depend on structural data and inverse folding, which can be limited by data availability and computational expense.

Key Contributions

  1. diffusioN Optimized Sampling (NOS): The authors introduce NOS, a method that follows gradients in the hidden states of the denoising network to guide sampling in discrete diffusion models. This enables search directly in protein sequence space, steering samples toward high predicted fitness without requiring structural data or inverse folding (a minimal sketch of the guidance step follows this list).
  2. LaMBO-2: An extension of the Bayesian optimization framework LaMBO, LaMBO-2 integrates NOS to handle multiple objectives in sequence design. A notable addition is the use of saliency maps to identify promising edit positions, concentrating a limited edit budget on the residues most likely to improve the objectives (see the saliency sketch after this list).
  3. Practical Application and Results: The authors demonstrate LaMBO-2 by optimizing antibody sequences for higher expression yield and binding affinity. LaMBO-2 improved binding affinity while keeping sequences natural-looking, attaining a 99% expression rate and a 40% binding rate in exploratory in vitro experiments.
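
To make item 1 concrete, below is a minimal sketch of hidden-state guidance in the spirit of NOS: at each denoising step, the hidden states are nudged by gradient ascent on a value model's prediction, with a KL penalty keeping the guided token distribution close to the unguided one, before the next noisy sequence is sampled. The `denoiser.encode`/`denoiser.decode` split, `value_head`, and the hyperparameters are illustrative assumptions, not the authors' API, and the Langevin-style noise injection used in the paper is omitted for brevity.

```python
import torch
import torch.nn.functional as F

def nos_guided_step(denoiser, value_head, x_t, step_size=0.1,
                    stability_coef=0.01, num_grad_steps=5):
    """One guided denoising step (hedged sketch of NOS-style guidance)."""
    with torch.no_grad():
        h_orig = denoiser.encode(x_t)          # hidden states, (batch, length, dim)
        logits_orig = denoiser.decode(h_orig)  # unguided token logits

    h = h_orig.clone().requires_grad_(True)
    for _ in range(num_grad_steps):
        logits = denoiser.decode(h)
        # Penalty: stay close to the unguided predictive distribution,
        # which keeps guided samples on the generative model's manifold.
        kl = F.kl_div(F.log_softmax(logits, dim=-1),
                      F.softmax(logits_orig, dim=-1),
                      reduction="batchmean")
        # Objective: predicted fitness of the hidden states, minus the penalty.
        objective = value_head(h).sum() - stability_coef * kl
        (grad,) = torch.autograd.grad(objective, h)
        h = (h + step_size * grad).detach().requires_grad_(True)

    with torch.no_grad():
        probs = F.softmax(denoiser.decode(h), dim=-1)
    # Sample the less-noisy sequence x_{t-1} from the guided distribution.
    return torch.distributions.Categorical(probs=probs).sample()
```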

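Item 2's saliency-map idea can be sketched in the same hedged spirit: an input-gradient saliency map scores each position by how strongly its embedding influences the predicted objective, and a limited edit budget is spent on the highest-scoring positions while the rest of the sequence is held fixed. `model.embed` and `model.encode` are assumed interfaces for illustration, not the authors' implementation.

```python
import torch

def saliency_edit_positions(model, value_head, tokens, edit_budget=4):
    """Choose edit positions from an input-gradient saliency map (sketch)."""
    # Detach and re-mark the embeddings as leaves so .grad is populated.
    emb = model.embed(tokens).detach().requires_grad_(True)  # (batch, length, dim)
    value_head(model.encode(emb)).sum().backward()
    saliency = emb.grad.norm(dim=-1)                         # (batch, length)
    # Spend the edit budget on the most salient positions; all others stay
    # fixed, which doubles as a locality constraint on the redesign.
    return saliency.topk(edit_budget, dim=-1).indices
```
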
Numerical Strengths and Results

The experimental results underscore the efficacy of NOS and LaMBO-2. In exploratory in vitro experiments, 99% of the optimized antibodies expressed and 40% bound their targets, indicating that the framework generates functional sequences at a high rate. Moreover, the guided sampling and optimization strategies consistently outperformed genetic algorithms and unguided search baselines in both single- and multi-objective settings.

Theoretical and Practical Implications

The introduction of NOS and its application in LaMBO-2 presents a notable shift in protein design paradigms. By focusing on sequence space rather than structure space, the authors tackle fundamental challenges in protein optimization such as data scarcity and computational limitations associated with structural models.

From a theoretical perspective, this work pushes the boundaries of sequence-based protein design by providing a mechanism to incorporate gradient-based guidance effectively. This could spur further research into improved methodologies for discrete optimization problems found in other domains of computational biology and engineering.

Practically, the potential applications in drug design, specifically antibody engineering, are substantial. The ability to efficiently generate high-fitness antibody libraries in silico without extensive high-throughput screening could significantly reduce the time and cost of developing therapeutic antibodies.

Future Developments

The methods developed in this paper enhance our capabilities in protein design, but there remain several avenues for future work. Extending these approaches to handle more complex objectives, further optimizing the computational efficiency of the framework, and applying these models to other types of biomolecules, such as nucleic acids, are promising directions.

Additionally, exploring hybrid models that integrate both sequence and structure information more efficiently could capitalize on the strengths of NOS and existing structural methods. Finally, given the rapid advancements in computational hardware and algorithms for deep learning, leveraging increased model complexity and dataset sizes could further boost the effectiveness of this approach.

In summary, the paper presents a comprehensive strategy for protein sequence optimization using guided discrete diffusion models, offering significant insights and results that advance the field of computational protein design.
