Scaling and Data Saturation in Protein Language Models (2507.22210v1)

Published 29 Jul 2025 in q-bio.QM

Abstract: Data in biology is redundant, noisy, and sparse. How do the type and scale of available data impact model performance? In this work, we investigate how protein language models (pLMs) scale with increasing pretraining data. We measure protein function prediction performance for a suite of pLMs pretrained on yearly snapshots of UniRef100 from 2011 to 2024. We find no evidence of model saturation on this task: performance improves--but not monotonically--with added data, and this trend differs between unsupervised and supervised experiments. Using a well-characterized Beta-Lactamase protein from E. coli, we find that unsupervised model predictions improve year over year, though they do not yet consistently outperform the supervised baseline. Our results underscore the need for targeted data acquisition and deeper study of data scaling in protein modeling. All training, inference, analysis, and visualization code is available at: https://github.com/Align-to-Innovate/data-saturation-and-scaling.
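
The abstract does not spell out how the unsupervised predictions are computed; a common approach for zero-shot variant-effect scoring with a pLM is the masked-marginal log-likelihood ratio, and the sketch below illustrates that idea. The ESM-2 checkpoint, the `mutation_score` helper, and the toy sequence are illustrative assumptions standing in for the paper's snapshot-pretrained models; see the linked repository for the actual code.

```python
# Minimal sketch of masked-marginal variant scoring with a protein
# language model. Assumptions: a small public ESM-2 checkpoint via
# HuggingFace Transformers stands in for the paper's models.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL = "facebook/esm2_t6_8M_UR50D"  # placeholder checkpoint choice
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForMaskedLM.from_pretrained(MODEL).eval()

def mutation_score(sequence: str, pos: int, wt: str, mut: str) -> float:
    """Return log p(mut) - log p(wt) at a masked position (0-indexed)."""
    assert sequence[pos] == wt, "wild-type residue mismatch"
    enc = tokenizer(sequence, return_tensors="pt")
    ids = enc["input_ids"].clone()
    tok_pos = pos + 1  # offset for the leading <cls> token in ESM vocab
    ids[0, tok_pos] = tokenizer.mask_token_id
    with torch.no_grad():
        logits = model(input_ids=ids, attention_mask=enc["attention_mask"]).logits
    log_probs = torch.log_softmax(logits[0, tok_pos], dim=-1)
    wt_id = tokenizer.convert_tokens_to_ids(wt)
    mut_id = tokenizer.convert_tokens_to_ids(mut)
    return (log_probs[mut_id] - log_probs[wt_id]).item()

# Toy usage (illustrative sequence, not a real Beta-Lactamase):
seq = "MSIQHFRVALIPFFAAFCLPVFA"
print(mutation_score(seq, pos=3, wt="Q", mut="K"))  # > 0: model favors mutant
```

A score above zero means the model assigns higher probability to the mutant residue at the masked site; ranking all single substitutions this way yields the kind of unsupervised fitness prediction that the abstract compares against a supervised baseline.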