Online Back-Parsing for AMR-to-Text Generation (2010.04520v1)

Published 9 Oct 2020 in cs.CL

Abstract: AMR-to-text generation aims to recover a text containing the same meaning as an input AMR graph. Current research develops increasingly powerful graph encoders to better represent AMR graphs, with decoders based on standard language modeling being used to generate outputs. We propose a decoder that back-predicts the projected AMR graph over the target sentence during text generation. As a result, our outputs better preserve the input meaning than those of standard decoders. Experiments on two AMR benchmarks show the superiority of our model over the previous state-of-the-art system based on the graph Transformer.
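The abstract's central idea, jointly generating target tokens while predicting which input graph node each token projects back to, can be illustrated with a minimal sketch. Everything below is a hypothetical illustration, not the paper's architecture: the names (`BackParsingDecoder`, `joint_loss`), the LSTM decoder, the dot-product alignment head, and the loss weight `alpha` are all assumptions; the actual model builds on a graph Transformer and also back-predicts projected edges, which this sketch omits.

```python
import torch
import torch.nn as nn

class BackParsingDecoder(nn.Module):
    """Toy decoder with an auxiliary back-parsing head.

    At each step it (1) predicts the next target token and
    (2) predicts which input AMR node that token projects to,
    so a graph-alignment loss can be trained jointly with the
    usual language-modeling loss. Hypothetical sketch only.
    """

    def __init__(self, vocab_size: int, hidden: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.LSTM(hidden, hidden, batch_first=True)
        self.vocab_head = nn.Linear(hidden, vocab_size)  # next-token logits
        self.node_proj = nn.Linear(hidden, hidden)       # for node-alignment scores

    def forward(self, tokens, node_states):
        # tokens:      (B, T)    target-side token ids (teacher forcing)
        # node_states: (B, N, H) encoder representations of the N AMR nodes
        h, _ = self.rnn(self.embed(tokens))              # (B, T, H)
        token_logits = self.vocab_head(h)                # (B, T, V)
        # Back-parsing head: score each decoder state against every graph
        # node via dot product (edge prediction from the paper is omitted).
        node_logits = self.node_proj(h) @ node_states.transpose(1, 2)  # (B, T, N)
        return token_logits, node_logits

def joint_loss(token_logits, node_logits, gold_tokens, gold_nodes, alpha=0.5):
    """Standard LM loss plus an interpolated back-parsing alignment loss."""
    ce = nn.CrossEntropyLoss(ignore_index=-100)
    lm = ce(token_logits.transpose(1, 2), gold_tokens)   # (B, V, T) vs (B, T)
    bp = ce(node_logits.transpose(1, 2), gold_nodes)     # (B, N, T) vs (B, T)
    return lm + alpha * bp
```

The intuition is that forcing the decoder to recover the node alignment at every step penalizes outputs that drop or distort parts of the input graph, which is how back-parsing improves meaning preservation over a decoder trained on the text loss alone.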

Authors (3)
  1. Xuefeng Bai (34 papers)
  2. Linfeng Song (76 papers)
  3. Yue Zhang (620 papers)
Citations (17)
