Simple, Efficient and Scalable Structure-aware Adapter Boosts Protein Language Models (2404.14850v1)

Published 23 Apr 2024 in cs.CL, cs.LG, and q-bio.BM

Abstract: Fine-tuning pre-trained protein language models (PLMs) has emerged as a prominent strategy for enhancing downstream prediction tasks, often outperforming traditional supervised learning approaches. Parameter-efficient fine-tuning, a widely applied and powerful technique in natural language processing, could potentially enhance the performance of PLMs. However, its direct transfer to life science tasks is non-trivial due to the different training strategies and data forms. To address this gap, we introduce SES-Adapter, a simple, efficient, and scalable adapter method for enhancing the representation learning of PLMs. SES-Adapter incorporates PLM embeddings with structural sequence embeddings to create structure-aware representations. We show that the proposed method is compatible with different PLM architectures and across diverse tasks. Extensive evaluations are conducted on 2 types of folding structures with notable quality differences, 9 state-of-the-art baselines, and 9 benchmark datasets across distinct downstream tasks. Results show that, compared to vanilla PLMs, SES-Adapter improves downstream task performance by a maximum of 11% and an average of 3%, accelerates training speed by a maximum of 1034% and an average of 362%, and improves the convergence rate by approximately 2 times. Moreover, positive optimization is observed even with low-quality predicted structures. The source code for SES-Adapter is available at https://github.com/tyang816/SES-Adapter.
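
To make the core idea concrete, below is a minimal, hypothetical sketch of a structure-aware adapter in the spirit described by the abstract: frozen PLM residue embeddings are fused with embeddings of a discretized structural sequence via cross-attention and pooled for a downstream prediction head. This is not the authors' implementation (see the linked repository for that); the fusion choice, dimensions, vocabulary size, and module names are illustrative assumptions.

```python
# Hypothetical sketch of a structure-aware adapter (not the SES-Adapter code).
# Assumes: residue embeddings from a frozen PLM and an aligned sequence of
# discrete structural tokens (e.g., from a structure alphabet) per protein.
import torch
import torch.nn as nn


class StructureAwareAdapter(nn.Module):
    def __init__(self, plm_dim=1280, struct_vocab=26, struct_dim=128,
                 num_heads=8, num_classes=2):
        super().__init__()
        # Embed and project the structural token sequence to the PLM width.
        self.struct_embed = nn.Embedding(struct_vocab, struct_dim)
        self.struct_proj = nn.Linear(struct_dim, plm_dim)
        # Cross-attention: PLM residue embeddings attend to structural embeddings.
        self.cross_attn = nn.MultiheadAttention(plm_dim, num_heads,
                                                batch_first=True)
        self.norm = nn.LayerNorm(plm_dim)
        self.head = nn.Linear(plm_dim, num_classes)

    def forward(self, plm_emb, struct_tokens, padding_mask=None):
        # plm_emb: (B, L, plm_dim) residue embeddings from a frozen PLM
        # struct_tokens: (B, L) integer structural tokens aligned to residues
        s = self.struct_proj(self.struct_embed(struct_tokens))
        fused, _ = self.cross_attn(query=plm_emb, key=s, value=s,
                                   key_padding_mask=padding_mask)
        fused = self.norm(plm_emb + fused)  # residual fusion of the two views
        pooled = fused.mean(dim=1)          # mean-pool over residues
        return self.head(pooled)


# Usage with random tensors standing in for real PLM / structure-token outputs.
adapter = StructureAwareAdapter()
plm_emb = torch.randn(2, 100, 1280)
struct_tokens = torch.randint(0, 26, (2, 100))
logits = adapter(plm_emb, struct_tokens)
print(logits.shape)  # torch.Size([2, 2])
```

Because only the adapter and head are trained while the PLM stays frozen, the number of trainable parameters stays small, which is consistent with the training-speed and convergence gains reported above.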

Authors (10)
  1. Yang Tan (39 papers)
  2. Mingchen Li (50 papers)
  3. Bingxin Zhou (29 papers)
  4. Bozitao Zhong (12 papers)
  5. Lirong Zheng (11 papers)
  6. Pan Tan (13 papers)
  7. Ziyi Zhou (33 papers)
  8. Huiqun Yu (8 papers)
  9. Guisheng Fan (10 papers)
  10. Liang Hong (67 papers)
Citations (6)
