GPT-too: A language-model-first approach for AMR-to-text generation (2005.09123v2)

Published 18 May 2020 in cs.CL and cs.LG

Abstract: Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs. Existing approaches to generating text from AMR have focused on training sequence-to-sequence or graph-to-sequence models on AMR annotated data only. In this paper, we propose an alternative approach that combines a strong pre-trained language model with cycle consistency-based re-scoring. Despite the simplicity of the approach, our experimental results show these models outperform all previous techniques on the English LDC2017T10 dataset, including the recent use of transformer architectures. In addition to the standard evaluation metrics, we provide human evaluation experiments that further substantiate the strength of our approach.
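The re-scoring step in the abstract works by checking cycle consistency: each candidate sentence sampled from the fine-tuned language model is parsed back into an AMR graph, and the candidate whose reconstructed graph best matches the input AMR is selected. Below is a minimal sketch of that selection loop; the `parse_to_amr` and `smatch_score` helpers are hypothetical stand-ins for an off-the-shelf AMR parser and a graph-overlap metric such as Smatch, not the paper's actual code.

```python
# Sketch of cycle-consistency re-scoring for AMR-to-text generation.
# Assumes external helpers (not from the paper): a text-to-AMR parser and
# a Smatch-style similarity function over AMR graphs.

def rescore_candidates(input_amr, candidates, parse_to_amr, smatch_score):
    """Return the candidate sentence whose re-parsed AMR best matches the input.

    input_amr:     source AMR graph (e.g., in PENMAN notation)
    candidates:    sentences sampled from a fine-tuned language model
    parse_to_amr:  callable mapping a sentence to an AMR graph (assumed interface)
    smatch_score:  callable scoring overlap between two AMR graphs (assumed interface)
    """
    best_sentence, best_score = None, float("-inf")
    for sentence in candidates:
        # Cycle consistency: parse the generated text back into an AMR graph.
        reconstructed_amr = parse_to_amr(sentence)
        # A higher score means the sentence better preserves the input's meaning.
        score = smatch_score(input_amr, reconstructed_amr)
        if score > best_score:
            best_sentence, best_score = sentence, score
    return best_sentence, best_score
```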

Authors (7)
  1. Manuel Mager
  2. Tahira Naseem
  3. Md Arafat Sultan
  4. Young-Suk Lee
  5. Radu Florian
  6. Salim Roukos
  7. Ramon Fernandez Astudillo
Citations (93)