UltraLink: An Open-Source Knowledge-Enhanced Multilingual Supervised Fine-tuning Dataset (2402.04588v2)

Published 7 Feb 2024 in cs.CL

Abstract: Open-source LLMs have grown significantly stronger across diverse fields. Nevertheless, most studies concentrate primarily on English, with only limited exploration of multilingual abilities. In this work, we therefore construct an open-source multilingual supervised fine-tuning dataset. Unlike previous works that simply translate English instructions, we consider both the language-specific and language-agnostic abilities of LLMs. First, we introduce a knowledge-grounded data augmentation approach to elicit more language-specific knowledge from LLMs, improving their ability to serve users from different countries. Moreover, we find that modern LLMs possess strong cross-lingual transfer capabilities, so repeatedly learning identical content in multiple languages is unnecessary. Consequently, we can substantially prune the language-agnostic supervised fine-tuning (SFT) data without any performance degradation, making multilingual SFT more efficient. The resulting UltraLink dataset comprises approximately 1 million samples across five languages (En, Zh, Ru, Fr, Es), and the proposed data construction method can easily be extended to other languages. UltraLink-LM, which is trained on UltraLink, outperforms several representative baselines across many tasks.
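The pruning idea in the abstract — keeping only a subset of the language-agnostic (English-derived) SFT data per target language and relying on cross-lingual transfer for the rest — can be illustrated with a minimal sketch. The function name, the keep ratio, and the random-sampling scheme below are illustrative assumptions, not the paper's actual procedure.

```python
import random

# Hypothetical sketch: instead of translating every language-agnostic
# English SFT sample into all target languages, keep only a fraction per
# language and rely on cross-lingual transfer for the remainder.
# TARGET_LANGS matches the paper's languages; keep_ratio is illustrative.

TARGET_LANGS = ["zh", "ru", "fr", "es"]

def prune_language_agnostic(english_pool, keep_ratio=0.25, seed=0):
    """Return, for each target language, a pruned subset of the English pool to translate."""
    rng = random.Random(seed)
    k = int(len(english_pool) * keep_ratio)
    return {lang: rng.sample(english_pool, k) for lang in TARGET_LANGS}

if __name__ == "__main__":
    pool = [{"instruction": f"task {i}", "response": f"answer {i}"} for i in range(1000)]
    subsets = prune_language_agnostic(pool)
    for lang, samples in subsets.items():
        # Each language receives a pruned subset rather than a full copy of the pool.
        print(lang, len(samples))
```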

Authors (12)
  1. Haoyu Wang (309 papers)
  2. Shuo Wang (382 papers)
  3. Yukun Yan (39 papers)
  4. Xujia Wang (3 papers)
  5. Zhiyu Yang (14 papers)
  6. Yuzhuang Xu (12 papers)
  7. Zhenghao Liu (77 papers)
  8. Ning Ding (122 papers)
  9. Xu Han (270 papers)
  10. Zhiyuan Liu (433 papers)
  11. Maosong Sun (337 papers)
  12. Liner Yang (22 papers)