Quest: Query-centric Data Synthesis Approach for Long-context Scaling of Large Language Model (2405.19846v6)

Published 30 May 2024 in cs.CL and cs.AI

Abstract: Recent advancements in LLMs have highlighted the importance of extending context lengths for handling complex tasks. While traditional methods for training on long contexts often use filtered long documents, these approaches lead to domain imbalances, limiting model performance. To address this, techniques like random document concatenation (Standard) and similarity-based methods (KNN, ICLM) have been developed. However, they either sacrifice semantic coherence or diversity. To balance both aspects, we introduce Quest, a query-centric data synthesis method aggregating semantically relevant yet diverse documents. Quest uses a generative model to predict potential queries for each document, grouping documents with similar queries and keywords. Extensive experiments demonstrate Quest's superior performance on long-context tasks, achieving remarkable results with context lengths of up to 1M tokens and confirming its scalability across various model sizes.
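
The abstract outlines Quest's pipeline at a high level: predict a potential query for each document, group documents whose queries share keywords, and concatenate each group into a long-context training example. Below is a minimal Python sketch of that idea. The function names, the stand-in query predictor, and the keyword heuristic are illustrative assumptions, not the paper's implementation; a real system would use a trained doc2query-style generator and a proper keyword extractor.

```python
# Minimal sketch of query-centric data synthesis as described in the
# abstract. All helpers here are hypothetical stand-ins for illustration.
from collections import defaultdict

def predict_query(document: str) -> str:
    """Stand-in for a generative model that predicts a potential query
    for a document. Here we crudely use the first sentence."""
    return document.split(".")[0].lower()

def extract_keywords(query: str, k: int = 3) -> frozenset:
    """Crude keyword heuristic: the k longest distinct words of the
    query. A real pipeline would use an actual keyword extractor."""
    words = sorted(set(query.split()), key=len, reverse=True)
    return frozenset(words[:k])

def synthesize_long_contexts(documents, max_tokens=1_000_000):
    """Group documents by shared query keywords, then concatenate each
    group into one long-context training example (up to max_tokens)."""
    groups = defaultdict(list)
    for doc in documents:
        key = extract_keywords(predict_query(doc))
        groups[key].append(doc)

    examples = []
    for docs in groups.values():
        context, length = [], 0
        for doc in docs:
            n = len(doc.split())  # rough whitespace token count
            if length + n > max_tokens:
                break
            context.append(doc)
            length += n
        if context:
            examples.append("\n\n".join(context))
    return examples
```

Grouping by predicted queries rather than raw document similarity is what lets the synthesized contexts stay topically coherent while still mixing diverse documents, which is the balance the abstract claims over the Standard, KNN, and ICLM baselines.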

Authors (4)
  1. Chaochen Gao (10 papers)
  2. Xing Wu (69 papers)
  3. Qi Fu (7 papers)
  4. Songlin Hu (80 papers)
Citations (3)