
Select and Attend: Towards Controllable Content Selection in Text Generation (1909.04453v1)

Published 10 Sep 2019 in cs.CL

Abstract: Many text generation tasks naturally contain two steps: content selection and surface realization. Current neural encoder-decoder models conflate both steps into a black-box architecture, so the content to be described in the text cannot be explicitly controlled. This paper tackles the problem by decoupling content selection from the decoder. The decoupled content selection is human-interpretable, and its values can be manually manipulated to control the content of the generated text. The model can be trained end-to-end without human annotations by maximizing a lower bound of the marginal likelihood. We further propose an effective way to trade off performance against controllability with a single adjustable hyperparameter. On both data-to-text and headline generation tasks, our model achieves promising results, paving the way for controllable content selection in text generation.
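The abstract describes content selection as an explicit, per-token decision that is decoupled from the decoder, trained by maximizing a lower bound on the marginal likelihood, and tuned with a single trade-off hyperparameter. Below is a minimal, hypothetical PyTorch-style sketch of that idea; all names (e.g. `ControllableSelector`, `beta`) are assumptions for illustration rather than the authors' code, and gradient estimation for the discrete selection mask (REINFORCE or Gumbel-Softmax) is omitted.

```python
import torch
import torch.nn as nn


class ControllableSelector(nn.Module):
    """Sketch of decoupled content selection: each source token receives a
    Bernoulli selection probability, and the decoder conditions only on the
    selected tokens. Names and structure are illustrative assumptions."""

    def __init__(self, vocab_size: int, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.select_head = nn.Linear(hidden, 1)   # p(select token i | source)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, src, tgt, beta: float = 0.1):
        enc, _ = self.encoder(self.embed(src))               # (B, S, H)
        probs = torch.sigmoid(self.select_head(enc))         # (B, S, 1)
        mask = torch.distributions.Bernoulli(probs).sample() # sampled selection

        # The decoder sees only the selected source states (masked context).
        context = (enc * mask).mean(dim=1, keepdim=True)     # (B, 1, H)
        dec, _ = self.decoder(self.embed(tgt) + context)
        logits = self.out(dec)

        # Reconstruction term of a sampled lower bound on the likelihood.
        nll = nn.functional.cross_entropy(
            logits.reshape(-1, logits.size(-1)), tgt.reshape(-1)
        )
        # beta weighs a selection regularizer against generation quality,
        # standing in for the paper's performance/controllability trade-off.
        reg = probs.mean()
        return nll + beta * reg
```

At inference time, the sampled `mask` could be replaced by a user-supplied binary vector to control which source tokens are realized in the output, which is the controllability the abstract refers to.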

Authors (6)
  1. Xiaoyu Shen (73 papers)
  2. Jun Suzuki (86 papers)
  3. Kentaro Inui (119 papers)
  4. Hui Su (38 papers)
  5. Dietrich Klakow (114 papers)
  6. Satoshi Sekine (11 papers)
Citations (28)