DeepLNE++ leveraging knowledge distillation for accelerated multi-state path-like collective variables

Published 5 Jul 2024 in physics.chem-ph (arXiv:2407.04376v1)

Abstract: Path-like collective variables can be very effective for accurately modeling complex biomolecular processes in molecular dynamics simulations. Recently, we introduced DeepLNE, a machine learning-based path-like CV that provides a progression variable s along the path as a non-linear combination of several descriptors, effectively approximating the reaction coordinate. However, DeepLNE is computationally expensive for realistic systems that require many descriptors, and it is limited in its ability to handle multi-state reactions. Here we present DeepLNE++, which uses a knowledge distillation approach to significantly accelerate the evaluation of DeepLNE, making it feasible to compute free energy landscapes for large and complex biomolecular systems. In addition, DeepLNE++ encodes system-specific knowledge within a supervised multitasking framework, enhancing its versatility and effectiveness.
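The distillation idea in the abstract can be illustrated schematically: an expensive "teacher" maps many descriptors to a progression variable s, and a cheaper "student" is fitted to reproduce the teacher's output so it can be evaluated quickly during simulation. The sketch below is a minimal, hypothetical stand-in, not the paper's actual architecture: the teacher is an arbitrary non-linear function of the descriptors, and the student is a simple linear model trained by gradient descent on teacher-labeled data.

```python
import math
import random

random.seed(0)

N_DESC = 8  # number of descriptors (illustrative, not from the paper)

def teacher_s(x):
    # Hypothetical stand-in for the expensive teacher CV:
    # a non-linear combination of descriptors, squashed to (0, 1)
    # so it behaves like a progression variable s along a path.
    z = sum((i + 1) * xi for i, xi in enumerate(x)) / N_DESC
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic descriptor vectors and teacher labels (the "distillation set").
data = [[random.uniform(-1, 1) for _ in range(N_DESC)] for _ in range(200)]
labels = [teacher_s(x) for x in data]

# Student: a cheap linear model s ~ w.x + b, fitted by stochastic gradient
# descent to reproduce the teacher's output -- the distillation step.
w = [0.0] * N_DESC
b = 0.5
lr = 0.05
for epoch in range(500):
    for x, s_t in zip(data, labels):
        s_pred = sum(wi * xi for wi, xi in zip(w, x)) + b
        err = s_pred - s_t
        for i in range(N_DESC):
            w[i] -= lr * err * x[i]
        b -= lr * err

def student_s(x):
    # Fast surrogate: evaluating this costs one dot product instead of
    # the teacher's full non-linear pipeline.
    return sum(wi * xi for wi, xi in zip(w, x)) + b

mse = sum((student_s(x) - s_t) ** 2 for x, s_t in zip(data, labels)) / len(data)
print(f"distillation MSE: {mse:.4f}")
```

In the paper the teacher and student are neural networks and the student is additionally trained within a supervised multitasking framework; the sketch only captures the core teacher-to-student transfer of the progression variable.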
