RetGen: A Joint framework for Retrieval and Grounded Text Generation Modeling (2105.06597v4)

Published 14 May 2021 in cs.CL and cs.AI

Abstract: Recent advances in large-scale pre-training such as GPT-3 allow seemingly high-quality text to be generated from a given prompt. However, such generation systems often suffer from problems of hallucinated facts, and are not inherently designed to incorporate useful external information. Grounded generation models appear to offer remedies, but their training typically relies on rarely-available parallel data where information-relevant documents are provided for context. We propose a framework that alleviates this data constraint by jointly training a grounded generator and document retriever on the language model signal. The model learns to reward retrieval of the documents with the highest utility in generation, and attentively combines them using a Mixture-of-Experts (MoE) ensemble to generate follow-on text. We demonstrate that both generator and retriever can take advantage of this joint training and work synergistically to produce more informative and relevant text in both prose and dialogue generation.
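
To make the training signal concrete, here is a minimal, hypothetical PyTorch sketch of RetGen-style joint training (not the authors' code; all module names, sizes, and the toy data are illustrative assumptions). A retriever scores candidate documents, the top-K each ground the generator, and the marginal log-likelihood of the target text ties the two together, so the retriever is effectively rewarded for retrieving documents with high generation utility. For brevity it mixes sequence-level likelihoods, a simplification of the paper's per-token Mixture-of-Experts combination.

```python
# Hypothetical sketch of joint retriever + grounded-generator training
# (illustrative only; not the RetGen reference implementation).
import torch
import torch.nn.functional as F

torch.manual_seed(0)
VOCAB, DIM, K = 100, 32, 2  # toy sizes, chosen arbitrarily


class Retriever(torch.nn.Module):
    """Dual-encoder stand-in: dot-product relevance of query vs. documents."""

    def __init__(self):
        super().__init__()
        self.q_proj = torch.nn.Linear(DIM, DIM)
        self.d_proj = torch.nn.Linear(DIM, DIM)

    def forward(self, query_vec, doc_vecs):
        return self.d_proj(doc_vecs) @ self.q_proj(query_vec)  # (num_docs,)


class Generator(torch.nn.Module):
    """Tiny causal LM stand-in; grounding = seeding the state with the doc."""

    def __init__(self):
        super().__init__()
        self.emb = torch.nn.Embedding(VOCAB, DIM)
        self.rnn = torch.nn.GRU(DIM, DIM, batch_first=True)
        self.head = torch.nn.Linear(DIM, VOCAB)

    def sequence_log_prob(self, doc_vec, target):
        h0 = doc_vec.view(1, 1, DIM)             # condition on retrieved doc
        x = self.emb(target[:-1].view(1, -1))    # teacher forcing
        h, _ = self.rnn(x, h0)
        logp = F.log_softmax(self.head(h).squeeze(0), dim=-1)  # (T-1, VOCAB)
        return logp[torch.arange(len(target) - 1), target[1:]].sum()


retriever, generator = Retriever(), Generator()
opt = torch.optim.Adam(
    list(retriever.parameters()) + list(generator.parameters()), lr=1e-3
)

# Toy batch: one query embedding, a candidate pool, one target continuation.
query = torch.randn(DIM)
doc_pool = torch.randn(8, DIM)
target = torch.randint(0, VOCAB, (12,))

scores = retriever(query, doc_pool)
top = torch.topk(scores, K).indices
retr_logp = F.log_softmax(scores[top], dim=-1)  # p(doc | query) over top-K

# Grounded LM likelihood of the same target under each retrieved document.
gen_logp = torch.stack(
    [generator.sequence_log_prob(doc_pool[z], target) for z in top]
)

# Marginal likelihood sum_z p(z|x) p(y|x,z): maximizing it trains both
# modules, rewarding the retriever for documents that make y likely.
loss = -torch.logsumexp(retr_logp + gen_logp, dim=0)
opt.zero_grad()
loss.backward()
opt.step()
print(f"joint NLL: {loss.item():.3f}")
```

The logsumexp marginal above is one common way to couple retrieval and generation end-to-end; per the abstract, the paper's MoE ensemble instead attentively combines the per-document experts during generation, which this sequence-level sketch does not attempt to reproduce.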

Authors (8)
  1. Yizhe Zhang (127 papers)
  2. Siqi Sun (46 papers)
  3. Xiang Gao (210 papers)
  4. Yuwei Fang (31 papers)
  5. Chris Brockett (37 papers)
  6. Michel Galley (50 papers)
  7. Jianfeng Gao (344 papers)
  8. Bill Dolan (45 papers)
Citations (25)