Lifelong Pretraining: Continually Adapting Language Models to Emerging Corpora (2110.08534v3)

Published 16 Oct 2021 in cs.CL

Abstract: Pretrained language models (PTLMs) are typically learned over a large, static corpus and further fine-tuned for various downstream tasks. However, when deployed in the real world, a PTLM-based model must deal with data distributions that deviate from what the PTLM was initially trained on. In this paper, we study a lifelong language model pretraining challenge where a PTLM is continually updated so as to adapt to emerging data. Over a domain-incremental research paper stream and a chronologically-ordered tweet stream, we incrementally pretrain a PTLM with different continual learning algorithms, and keep track of the downstream task performance (after fine-tuning). We evaluate the PTLM's ability to adapt to new corpora while retaining knowledge learned from earlier corpora. Our experiments show distillation-based approaches to be most effective in retaining downstream performance in earlier domains. The algorithms also improve knowledge transfer, allowing models to achieve better downstream performance on the latest data, and improve temporal generalization when distribution gaps exist between training and evaluation because of time. We believe our problem formulation, methods, and analysis will inspire future studies toward the continual pretraining of language models.
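The abstract's central finding is that distillation-based continual learning best retains earlier-domain performance during incremental pretraining. Below is a minimal PyTorch/Hugging Face sketch of that idea: while pretraining on a new corpus, the current model (student) is regularized toward the token predictions of a frozen copy of its previous-domain checkpoint (teacher). This is an illustration under stated assumptions, not the paper's implementation: the roberta-base checkpoint, the helper name distill_mlm_step, and the loss weight alpha and temperature are placeholders, not the authors' configuration.

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
student = AutoModelForMaskedLM.from_pretrained("roberta-base")  # model being updated on the new domain
teacher = AutoModelForMaskedLM.from_pretrained("roberta-base")  # stand-in for the frozen previous-domain checkpoint
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)

def distill_mlm_step(batch, labels, alpha=0.5, temperature=2.0):
    """One continual-pretraining step: masked-LM loss on the new corpus plus a
    KL term pulling the student's token distributions toward the teacher's.
    alpha and temperature are illustrative assumptions, not reported values."""
    out = student(**batch, labels=labels)            # out.loss is the MLM cross-entropy
    with torch.no_grad():
        teacher_logits = teacher(**batch).logits
    kd = F.kl_div(
        F.log_softmax(out.logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return out.loss + alpha * kd

# Toy usage: mask one token by hand (real MLM pretraining masks ~15% at random).
enc = tokenizer("Continual pretraining adapts models to emerging corpora.",
                return_tensors="pt")
labels = torch.full_like(enc["input_ids"], -100)     # -100 positions are ignored by the MLM loss
pos = 3
labels[0, pos] = enc["input_ids"][0, pos]
enc["input_ids"][0, pos] = tokenizer.mask_token_id

loss = distill_mlm_step(enc, labels)
loss.backward()
```

The design intuition, consistent with the abstract, is that the teacher anchors the student's predictions to its behavior on earlier domains, so adaptation to the new corpus forgets less of what was already learned.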

Authors (8)
  1. Xisen Jin (14 papers)
  2. Dejiao Zhang (20 papers)
  3. Henghui Zhu (24 papers)
  4. Wei Xiao (100 papers)
  5. Shang-Wen Li (55 papers)
  6. Xiaokai Wei (14 papers)
  7. Andrew Arnold (14 papers)
  8. Xiang Ren (194 papers)
Citations (101)
