
Controlled Text Generation for Black-box Language Models via Score-based Progressive Editor (2311.07430v2)

Published 13 Nov 2023 in cs.CL

Abstract: Controlled text generation is very important for the practical use of LLMs because it ensures that the produced text includes only the desired attributes from a specific domain or dataset. Existing methods, however, are inapplicable to black-box models or suffer a significant trade-off between controlling the generated text and maintaining its fluency. This paper introduces the Score-based Progressive Editor (ScoPE), a novel approach designed to overcome these issues. ScoPE modifies the context at the token level during the generation process of a backbone LLM. This modification guides the subsequent text to naturally include the target attributes. To facilitate this process, ScoPE employs a training objective that maximizes a target score, thoroughly considering both the ability to guide the text and its fluency. Experimental results on diverse controlled generation tasks demonstrate that ScoPE can effectively regulate the attributes of the generated text while fully utilizing the capability of the backbone LLMs. Our codes are available at \url{https://github.com/ysw1021/ScoPE}.
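The abstract describes ScoPE as editing the context at the token level during generation so that a black-box backbone LM is steered toward target attributes. The toy sketch below illustrates that general idea only; the generator, scorer, and editing loop here are illustrative stand-ins, not the authors' actual models, training objective, or API.

```python
import random

# Illustrative stand-ins (NOT the authors' implementation): a black-box
# generator proposes the next token; a scorer rates how strongly the text
# carries the target attribute; a greedy editor applies the single best
# token-level replacement after each generation step.

VOCAB = ["good", "bad", "great", "terrible", "movie", "plot", "the", "was"]
POSITIVE = {"good", "great"}  # toy "target attribute" tokens

def generate_next(context):
    """Toy black-box backbone LM: returns a random next token."""
    return random.choice(VOCAB)

def target_score(tokens):
    """Toy target score: fraction of tokens carrying the target attribute."""
    return sum(t in POSITIVE for t in tokens) / max(len(tokens), 1)

def progressive_edit(tokens):
    """Greedy token-level edit: try replacing each token with each
    candidate and keep the single replacement that most improves
    the target score."""
    best, best_score = tokens, target_score(tokens)
    for i in range(len(tokens)):
        for cand in VOCAB:
            edited = tokens[:i] + [cand] + tokens[i + 1:]
            s = target_score(edited)
            if s > best_score:
                best, best_score = edited, s
    return best

def controlled_generate(prompt_tokens, steps=5):
    """Alternate black-box generation with score-based context editing,
    so edited context guides subsequent tokens toward the attribute."""
    ctx = list(prompt_tokens)
    for _ in range(steps):
        ctx.append(generate_next(ctx))   # black-box generation step
        ctx = progressive_edit(ctx)      # score-guided context edit
    return ctx

random.seed(0)
out = controlled_generate(["the", "movie", "was"])
```

In this sketch the editor only needs scores, never gradients or logits from the backbone, which is the property that makes the approach compatible with black-box models.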

Authors (4)
  1. Sangwon Yu (8 papers)
  2. Changmin Lee (26 papers)
  3. Hojin Lee (14 papers)
  4. Sungroh Yoon (163 papers)
