CoCon: A Self-Supervised Approach for Controlled Text Generation (2006.03535v3)

Published 5 Jun 2020 in cs.CL, cs.LG, and cs.NE

Abstract: Pretrained Transformer-based language models (LMs) display remarkable natural language generation capabilities. With their immense potential, controlling text generation of such LMs is getting attention. While there are studies that seek to control high-level attributes (such as sentiment and topic) of generated text, there is still a lack of more precise control over its content at the word- and phrase-level. Here, we propose Content-Conditioner (CoCon) to control an LM's output text with a content input, at a fine-grained level. In our self-supervised approach, the CoCon block learns to help the LM complete a partially-observed text sequence by conditioning with content inputs that are withheld from the LM. Through experiments, we show that CoCon can naturally incorporate target content into generated texts and control high-level text attributes in a zero-shot manner.
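To make the abstract's self-supervised idea concrete, below is a minimal PyTorch sketch of a content-conditioning block in the spirit of CoCon. It is not the authors' exact architecture; the class name, dimensions, and the cross-attention fusion are illustrative assumptions. The key idea it shows is the training signal: a sequence is split into an observed prefix and a withheld continuation, the continuation serves as the content input, and the block learns to steer the LM's hidden states so that the conditioned LM reconstructs the withheld text.

```python
# Hedged sketch of a CoCon-style content-conditioning block (assumed design,
# not the paper's exact implementation).
import torch
import torch.nn as nn


class ContentConditioner(nn.Module):
    """Fuses content-input representations into the LM's intermediate hidden
    states via cross-attention, so later (frozen) LM layers generate text that
    incorporates the target content."""

    def __init__(self, d_model: int = 768, n_heads: int = 8):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, prompt_hidden: torch.Tensor, content_hidden: torch.Tensor) -> torch.Tensor:
        # prompt_hidden:  (batch, prompt_len, d_model)  hidden states of the text so far
        # content_hidden: (batch, content_len, d_model) hidden states of the content input
        attn_out, _ = self.cross_attn(prompt_hidden, content_hidden, content_hidden)
        h = self.norm1(prompt_hidden + attn_out)
        h = self.norm2(h + self.ff(h))
        return h  # content-conditioned hidden states, passed on to the LM's remaining layers


# Self-supervised setup (assumption): split a training sequence x = [x_a ; x_b],
# expose only the prefix x_a to the LM, feed hidden states of the withheld
# continuation x_b as the content input, and train the block so the conditioned
# LM reconstructs x_b with the usual language-modeling loss.
if __name__ == "__main__":
    block = ContentConditioner()
    prompt_h = torch.randn(2, 10, 768)   # hidden states for the observed prefix x_a
    content_h = torch.randn(2, 6, 768)   # hidden states for the withheld content x_b
    conditioned = block(prompt_h, content_h)
    print(conditioned.shape)             # torch.Size([2, 10, 768])
```

At inference time, the same block can be fed an arbitrary content input (a target word or phrase) instead of a withheld continuation, which is how fine-grained content control and the zero-shot attribute control described in the abstract become possible.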

Authors (5)
  1. Alvin Chan (15 papers)
  2. Yew-Soon Ong (105 papers)
  3. Bill Pung (1 paper)
  4. Aston Zhang (48 papers)
  5. Jie Fu (229 papers)
Citations (81)
