QURIOUS: Question Generation Pretraining for Text Generation (2004.11026v1)

Published 23 Apr 2020 in cs.CL

Abstract: Recent trends in natural language processing using pretraining have shifted focus towards pretraining and fine-tuning approaches for text generation. Often the focus has been on task-agnostic approaches that generalize the language modeling objective. We propose question generation as a pretraining method, which better aligns with text generation objectives. Our text generation models pretrained with this method are better at understanding the essence of the input and are better language models for the target task. When evaluated on two text generation tasks, abstractive summarization and answer-focused question generation, our models result in state-of-the-art performance in terms of automatic metrics. Human evaluators also found our summaries and generated questions to be more natural, concise and informative.
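
The abstract describes a two-stage recipe: pretrain a sequence-to-sequence model on question generation (passage in, question out), then fine-tune the same weights on the target generation task such as abstractive summarization. Below is a minimal sketch of that recipe, assuming a generic encoder-decoder; the choice of T5, the toy passage-question pair, and the `train_step` helper are illustrative stand-ins, not the authors' actual model, data, or training pipeline.

```python
# Hypothetical sketch of the QURIOUS-style recipe: question-generation
# pretraining followed by fine-tuning on a target generation task.
# T5 and the toy examples below are stand-ins, not the paper's setup.
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

def train_step(source: str, target: str) -> float:
    """One seq2seq update: condition on `source`, maximize p(`target`)."""
    batch = tokenizer(source, return_tensors="pt", truncation=True)
    labels = tokenizer(target, return_tensors="pt", truncation=True).input_ids
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# Stage 1 -- pretraining objective: passage -> question. Generating a good
# question forces the model to identify the salient content of the input,
# which is the alignment argument the abstract makes.
train_step(
    "generate question: The Eiffel Tower was completed in 1889 in Paris.",
    "When was the Eiffel Tower completed?",
)

# Stage 2 -- fine-tune the same weights on the target generation task,
# e.g. abstractive summarization.
train_step(
    "summarize: The Eiffel Tower, completed in 1889, remains one of the "
    "most visited monuments in Paris and a global icon of France.",
    "The Eiffel Tower is one of Paris's most visited monuments.",
)
```

In practice the pretraining stage would iterate over a large corpus of passage-question pairs (e.g. derived from question-answering datasets) rather than a single example; only the objective, not the architecture, changes between the two stages.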

Authors (5)
  1. Shashi Narayan (35 papers)
  2. Ji Ma (72 papers)
  3. Hannah Craighead (1 paper)
  4. Ryan McDonald (24 papers)
  5. Gonçalo Simoes (2 papers)
Citations (15)