
Towards Verifiable Text Generation with Symbolic References (2311.09188v2)

Published 15 Nov 2023 in cs.CL, cs.AI, and cs.LG

Abstract: LLMs are vulnerable to hallucinations, and thus their outputs generally require laborious human verification for high-stakes applications. To this end, we propose symbolically grounded generation (SymGen) as a simple approach for enabling easier manual validation of an LLM's output. SymGen prompts an LLM to interleave its regular output text with explicit symbolic references to fields present in some conditioning data (e.g., a table in JSON format). The references can be used to display the provenance of different spans of text in the generation, reducing the effort required for manual verification. Across a range of data-to-text and question-answering experiments, we find that LLMs are able to directly output text that makes use of accurate symbolic references while maintaining fluency and factuality. In a human study we further find that such annotations can streamline human verification of machine-generated text. Our code will be available at http://symgen.github.io.
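The abstract describes interleaving generated text with symbolic references to fields in the conditioning data so that each span's provenance can be displayed. As a rough illustration of that idea (not the paper's actual implementation), the sketch below renders a template containing placeholder references of the form `{field}` against a JSON-like record, while recording which output spans came from which fields; the placeholder syntax and function name are assumptions for this example.

```python
import re

def resolve_references(template: str, record: dict):
    """Render a SymGen-style template against a data record.

    References use an assumed {field} placeholder syntax. Returns the
    rendered text plus a provenance list of (start, end, field) spans,
    so a verification UI could highlight which output spans were copied
    from which data fields.
    """
    out = []          # rendered text fragments
    provenance = []   # (start, end, field) spans in the rendered text
    cursor = 0        # length of rendered text so far
    last = 0          # end of the previous match in the template
    for m in re.finditer(r"\{(\w+)\}", template):
        literal = template[last:m.start()]
        out.append(literal)
        cursor += len(literal)
        field = m.group(1)
        value = str(record[field])  # substitute the referenced field
        provenance.append((cursor, cursor + len(value), field))
        out.append(value)
        cursor += len(value)
        last = m.end()
    out.append(template[last:])  # trailing literal text
    return "".join(out), provenance

text, prov = resolve_references(
    "The player {name} scored {points} points.",
    {"name": "Jordan", "points": 30},
)
# text → "The player Jordan scored 30 points."
# prov → [(11, 17, "name"), (25, 27, "points")]
```

Because the references resolve deterministically from the data, a reader only needs to check the surrounding free text, which is the verification-effort reduction the paper targets.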

Authors (6)
  1. Lucas Torroba Hennigen (14 papers)
  2. Shannon Shen (2 papers)
  3. Aniruddha Nrusimha (8 papers)
  4. Bernhard Gapp (1 paper)
  5. David Sontag (95 papers)
  6. Yoon Kim (92 papers)
Citations (7)