Towards Effective and Efficient Continual Pre-training of Large Language Models (2407.18743v1)

Published 26 Jul 2024 in cs.CL

Abstract: Continual pre-training (CPT) has been an important approach for adapting LLMs to specific domains or tasks. To make the CPT approach more traceable, this paper presents a technical report for continually pre-training Llama-3 (8B), which significantly enhances the Chinese language ability and scientific reasoning ability of the backbone model. To enhance the new abilities while retaining the original abilities, we design specific data mixture and curriculum strategies by utilizing existing datasets and synthesizing high-quality datasets. Specifically, we synthesize multidisciplinary scientific question and answer (QA) pairs based on related web pages, and subsequently incorporate these synthetic data to improve the scientific reasoning ability of Llama-3. We refer to the model after CPT as Llama-3-SynE (Synthetic data Enhanced Llama-3). We also present the tuning experiments with a relatively small model -- TinyLlama, and employ the derived findings to train the backbone model. Extensive experiments on a number of evaluation benchmarks show that our approach can largely improve the performance of the backbone models, including both the general abilities (+8.81 on C-Eval and +6.31 on CMMLU) and the scientific reasoning abilities (+12.00 on MATH and +4.13 on SciEval), without hurting the original capacities. Our model, data, and codes are available at https://github.com/RUC-GSAI/Llama-3-SynE.
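
The abstract's core recipe, two-stage continual pre-training over a mixture of replayed original-distribution data plus new Chinese and synthetic scientific QA data, can be sketched with Hugging Face `datasets`. This is a minimal illustration assuming streaming corpora; the dataset names, mixture probabilities, and stage split below are hypothetical placeholders, not the configuration reported in the paper.

```python
# Minimal sketch of a two-stage data-mixture curriculum for continual
# pre-training (CPT). All dataset names and mixture ratios are
# hypothetical placeholders, not the paper's actual recipe.
from datasets import load_dataset, interleave_datasets

# Replayed original-distribution data (to retain existing abilities)
# plus new-domain data (Chinese text and synthetic scientific QA pairs).
english = load_dataset("org/english-web", split="train", streaming=True)       # placeholder
chinese = load_dataset("org/chinese-web", split="train", streaming=True)       # placeholder
sci_qa = load_dataset("org/synthetic-sci-qa", split="train", streaming=True)   # placeholder

# Stage 1: broad bilingual adaptation, with heavy replay of the original
# distribution to limit catastrophic forgetting.
stage1 = interleave_datasets(
    [english, chinese, sci_qa],
    probabilities=[0.5, 0.4, 0.1],  # assumed ratios
    seed=42,
)

# Stage 2: shift the mixture toward synthetic scientific QA to target
# scientific reasoning, while still replaying some original data.
stage2 = interleave_datasets(
    [english, chinese, sci_qa],
    probabilities=[0.3, 0.3, 0.4],  # assumed ratios
    seed=42,
)

# Each stage would feed a standard causal-LM training loop (e.g.,
# transformers.Trainer over Llama-3-8B) for a fixed token budget.
```

Note that, per the abstract, the authors tuned this kind of strategy on a smaller model (TinyLlama) first and transferred the findings to the 8B backbone, a cheap way to search over mixture ratios before committing full-scale compute.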

Authors (19)
  1. Jie Chen (602 papers)
  2. Zhipeng Chen (46 papers)
  3. Jiapeng Wang (22 papers)
  4. Kun Zhou (217 papers)
  5. Yutao Zhu (63 papers)
  6. Jinhao Jiang (25 papers)
  7. Yingqian Min (14 papers)
  8. Wayne Xin Zhao (196 papers)
  9. Zhicheng Dou (113 papers)
  10. Jiaxin Mao (47 papers)
  11. Yankai Lin (125 papers)
  12. Ruihua Song (48 papers)
  13. Jun Xu (397 papers)
  14. Xu Chen (413 papers)
  15. Rui Yan (250 papers)
  16. Zhewei Wei (68 papers)
  17. Di Hu (88 papers)
  18. Wenbing Huang (95 papers)
  19. Ji-Rong Wen (299 papers)
Citations (3)