Generalizing From Short to Long: Effective Data Synthesis for Long-Context Instruction Tuning (2502.15592v1)

Published 21 Feb 2025 in cs.CL and cs.AI

Abstract: Long-context modelling for LLMs has been a key area of recent research because many real-world use cases require reasoning over longer inputs such as documents. The focus of research into modelling long context has been on how to model position, and there has been little investigation into other important aspects of language modelling such as instruction tuning. Long-context training examples are challenging and expensive to create and use. In this paper, we investigate how to design instruction data for the post-training phase of a long-context pre-trained model: how much and what type of context is needed for optimal and efficient post-training. Our controlled study reveals that models instruction-tuned on short contexts can effectively generalize to longer ones, while also identifying other critical factors such as instruction difficulty and context composition. Based on these findings, we propose context synthesis, a novel data synthesis framework that leverages off-the-shelf LLMs to generate extended background contexts for high-quality instruction-answer pairs. Experimental results on the document-level benchmark LongBench demonstrate that our proposed approach outperforms previous instruction synthesis approaches and comes close to the performance of human-annotated long-context instruction data. The project will be available at: https://github.com/NJUNLP/context-synthesis.
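To make the core idea concrete, below is a minimal sketch of what one context-synthesis step might look like: an off-the-shelf LLM is prompted to write an extended background document for an existing instruction-answer pair, and the result is packaged as a long-context training example. The function name, prompt wording, and the `generate` callable are illustrative assumptions, not the authors' implementation.

```python
def synthesize_context(generate, instruction: str, answer: str,
                       target_tokens: int = 4096) -> dict:
    """Build one long-context instruction-tuning example (illustrative sketch).

    `generate` is any text-completion callable (e.g. a thin wrapper around an
    off-the-shelf LLM API); the prompt below is a hypothetical stand-in for
    whatever prompt the framework actually uses.
    """
    prompt = (
        "Write a detailed background document of roughly "
        f"{target_tokens} tokens such that the following question can be "
        "answered from it.\n\n"
        f"Question: {instruction}\nAnswer: {answer}\n\nDocument:"
    )
    context = generate(prompt)

    # The final example pairs the synthesized long context with the original
    # short instruction-answer pair, yielding long-context training data
    # without human annotation of long documents.
    return {"context": context, "instruction": instruction, "answer": answer}
```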

Authors (7)
  1. Wenhao Zhu
  2. Pinzhen Chen
  3. Hanxu Hu
  4. Shujian Huang
  5. Fei Yuan
  6. Jiajun Chen
  7. Alexandra Birch