
Constrained Text Generation with Global Guidance -- Case Study on CommonGen (2103.07170v1)

Published 12 Mar 2021 in cs.CL

Abstract: This paper studies constrained text generation, which is to generate sentences under certain pre-conditions. We focus on CommonGen, the task of generating text based on a set of concepts, as a representative task of constrained text generation. Traditional methods mainly rely on supervised training to maximize the likelihood of target sentences. However, global constraints such as common sense and coverage cannot be incorporated into the likelihood objective of the autoregressive decoding process. In this paper, we consider using reinforcement learning to address this limitation, measuring global constraints including fluency, common sense and concept coverage with a comprehensive score, which serves as the reward for reinforcement learning. In addition, we design a guided decoding method at the word, fragment and sentence levels. Experiments demonstrate that our method significantly increases the concept coverage and outperforms existing models in various automatic evaluations.
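The abstract describes a comprehensive score combining fluency, common sense, and concept coverage, used as the reward for reinforcement learning. A minimal sketch of how such a composite reward might be assembled is below; the scoring functions, weights, and whole-word coverage check are illustrative assumptions, not the authors' exact formulation (the paper's fluency and common-sense scores would come from learned models, stubbed out here as plain inputs).

```python
def concept_coverage(sentence, concepts):
    """Fraction of required concepts appearing in the sentence.

    Whole-word matching is a simplifying assumption; the paper's coverage
    measure may handle morphological variants (e.g. "catch" vs "catches").
    """
    tokens = set(sentence.lower().split())
    covered = sum(1 for c in concepts if c.lower() in tokens)
    return covered / len(concepts)


def composite_reward(sentence, concepts, fluency, commonsense,
                     w_flu=0.3, w_cs=0.3, w_cov=0.4):
    """Comprehensive score used as the RL reward (weights are assumptions).

    fluency and commonsense are assumed to be scores in [0, 1] produced by
    external scoring models, passed in precomputed.
    """
    return (w_flu * fluency
            + w_cs * commonsense
            + w_cov * concept_coverage(sentence, concepts))
```

In a REINFORCE-style setup, this scalar would weight the log-likelihood of sampled sentences during policy-gradient updates, letting global constraints shape generation in a way the token-level likelihood objective cannot.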

Authors (5)
  1. Yixian Liu (4 papers)
  2. Liwen Zhang (34 papers)
  3. Wenjuan Han (36 papers)
  4. Yue Zhang (620 papers)
  5. Kewei Tu (74 papers)
Citations (9)
