
Plan-then-Generate: Controlled Data-to-Text Generation via Planning (2108.13740v1)

Published 31 Aug 2021 in cs.CL

Abstract: Recent developments in neural networks have advanced data-to-text generation. However, neural models offer little control over the structure of the generated output, which can be limiting in certain real-world applications. In this study, we propose a novel Plan-then-Generate (PlanGen) framework to improve the controllability of neural data-to-text models. Extensive experiments and analyses are conducted on two benchmark datasets, ToTTo and WebNLG. The results show that our model is able to control both the intra-sentence and inter-sentence structure of the generated output. Furthermore, empirical comparisons against previous state-of-the-art methods show that our model improves both generation quality and output diversity, as judged by human and automatic evaluations.
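The two-stage idea described in the abstract can be illustrated with a toy sketch: a planner first fixes an ordered content plan over the input records, and a generator then realizes text that follows that plan. The helper names and the template-based realizer below are illustrative assumptions; the actual PlanGen components are neural sequence models, not templates.

```python
# Hypothetical sketch of the Plan-then-Generate pipeline (not the paper's
# neural implementation): planning and realization are separate stages,
# so changing the plan changes the structure of the output text.

def plan_content(record: dict, slot_order: list[str]) -> list[tuple[str, str]]:
    """Stage 1 (planner): produce a content plan -- an ordered list of
    (slot, value) pairs that fixes the structure of the output."""
    return [(slot, record[slot]) for slot in slot_order if slot in record]

def generate(plan: list[tuple[str, str]]) -> str:
    """Stage 2 (generator): realize the plan as text, verbalizing the
    slots in exactly the planned order."""
    return ", ".join(f"{slot}: {value}" for slot, value in plan)

record = {"name": "Alice", "team": "Lions", "points": "12"}
plan = plan_content(record, ["name", "points", "team"])
print(generate(plan))  # -> name: Alice, points: 12, team: Lions
```

Because the plan is an explicit intermediate output, swapping the slot order (e.g. `["team", "name", "points"]`) directly changes the generated sentence structure, which is the controllability the framework targets.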

Authors (5)
  1. Yixuan Su (35 papers)
  2. David Vandyke (18 papers)
  3. Sihui Wang (12 papers)
  4. Yimai Fang (4 papers)
  5. Nigel Collier (83 papers)
Citations (77)