
Towards Verifiable Text Generation with Evolving Memory and Self-Reflection (2312.09075v3)

Published 14 Dec 2023 in cs.CL

Abstract: Despite the remarkable ability of LLMs in language comprehension and generation, they often suffer from producing factually incorrect information, also known as hallucination. A promising solution to this issue is verifiable text generation, which prompts LLMs to generate content with citations for accuracy verification. However, verifiable text generation is non-trivial due to the focus-shifting phenomenon, the intricate reasoning needed to align the claim with correct citations, and the dilemma between the precision and breadth of retrieved documents. In this paper, we present VTG, an innovative framework for Verifiable Text Generation with evolving memory and self-reflection. VTG introduces evolving long short-term memory to retain both valuable documents and recent documents. A two-tier verifier equipped with an evidence finder is proposed to rethink and reflect on the relationship between the claim and citations. Furthermore, active retrieval and diverse query generation are utilized to enhance both the precision and breadth of the retrieved documents. We conduct extensive experiments on five datasets across three knowledge-intensive tasks and the results reveal that VTG significantly outperforms baselines.
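Below is a minimal, hypothetical sketch of the generate-retrieve-verify loop that the abstract describes (evolving long short-term memory, a two-tier verifier, active retrieval with diverse queries). All names here (`EvolvingMemory`, `two_tier_verify`, `vtg_generate`, the toy retriever) are illustrative placeholders, not the authors' actual implementation or API.

```python
# Hypothetical sketch of a VTG-style pipeline, assuming placeholder components.
from collections import deque


class EvolvingMemory:
    """Evolving long short-term memory: a long-term store of documents judged
    valuable plus a bounded short-term buffer of the most recent retrievals."""

    def __init__(self, short_capacity=5):
        self.long_term = []                           # documents kept as "valuable"
        self.short_term = deque(maxlen=short_capacity)

    def add(self, docs):
        self.short_term.extend(docs)

    def promote(self, doc):
        if doc not in self.long_term:
            self.long_term.append(doc)

    def context(self):
        return self.long_term + list(self.short_term)


def generate_queries(claim, n=3):
    """Diverse query generation (placeholder): in the paper this would prompt
    the LLM for several reformulations to broaden retrieval."""
    return [f"{claim} (rephrasing {i})" for i in range(n)]


def retrieve(query, corpus):
    """Toy retriever: keyword overlap against an in-memory corpus."""
    terms = set(query.lower().split())
    return [d for d in corpus if terms & set(d.lower().split())]


def two_tier_verify(claim, citations):
    """Two-tier verifier (placeholder): a coarse support check standing in for
    the verifier plus evidence finder that reflects on claim-citation fit."""
    return any(w in c.lower() for c in citations for w in claim.lower().split())


def vtg_generate(claims, corpus, max_rounds=2):
    memory = EvolvingMemory()
    output = []
    for claim in claims:
        for _ in range(max_rounds):
            for q in generate_queries(claim):
                memory.add(retrieve(q, corpus))       # active retrieval per round
            citations = memory.context()
            if two_tier_verify(claim, citations):     # accept only verified claims
                for c in citations:
                    memory.promote(c)                 # retain valuable documents
                output.append((claim, citations))
                break
        else:
            output.append((claim, []))                # claim left unverified
    return output


if __name__ == "__main__":
    corpus = ["Paris is the capital of France.", "The Eiffel Tower is in Paris."]
    print(vtg_generate(["Paris is the capital of France"], corpus))
```

The sketch only mirrors the control flow implied by the abstract: retrieval feeds a short-term buffer, verified evidence is promoted to long-term memory, and claims are re-retrieved for a few rounds before being emitted without citations.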

Authors (8)
  1. Hao Sun (383 papers)
  2. Hengyi Cai (20 papers)
  3. Bo Wang (823 papers)
  4. Yingyan Hou (9 papers)
  5. Xiaochi Wei (12 papers)
  6. Shuaiqiang Wang (68 papers)
  7. Yan Zhang (954 papers)
  8. Dawei Yin (165 papers)
Citations (7)