
Controllable Natural Language Generation with Contrastive Prefixes (2202.13257v1)

Published 27 Feb 2022 in cs.CL

Abstract: To guide the generation of large pretrained language models (LMs), previous work has focused on directly fine-tuning the LM or utilizing an attribute discriminator. In this work, we propose a novel lightweight framework for controllable GPT2 generation, which utilizes a set of small attribute-specific vectors, called prefixes, to steer natural language generation. Different from prefix-tuning, where each prefix is trained independently, we take the relationship among prefixes into consideration and train multiple prefixes simultaneously. We propose a novel supervised method and also an unsupervised method to train the prefixes for single-aspect control, while the combination of these two methods can achieve multi-aspect control. Experimental results on both single-aspect and multi-aspect control show that our methods can guide generation towards the desired attributes while keeping high linguistic quality.
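The core mechanism the abstract describes, steering a frozen LM by prepending a small attribute-specific prefix to its activations, can be illustrated with a minimal sketch. This is a hypothetical toy (NumPy stand-in values, invented names like `steer` and `prefixes`), not the authors' implementation; in practice the prefixes are trainable per-layer key/value vectors inserted into the LM's attention, and here they are reduced to a single matrix of virtual-token activations prepended to the input:

```python
import numpy as np

rng = np.random.default_rng(0)
PREFIX_LEN, D_MODEL = 10, 768  # assumed sizes for illustration

# One small trainable prefix per attribute (e.g. sentiment control).
# In the paper's framework these are trained jointly, so the prefixes
# for contrasting attributes are learned in relation to each other.
prefixes = {
    "positive": rng.normal(0.0, 0.02, (PREFIX_LEN, D_MODEL)),
    "negative": rng.normal(0.0, 0.02, (PREFIX_LEN, D_MODEL)),
}

def steer(input_embeddings, attribute):
    """Prepend the chosen attribute's prefix to the input activations;
    the frozen LM then conditions its generation on [prefix ; input]."""
    return np.concatenate([prefixes[attribute], input_embeddings], axis=0)

x = rng.normal(size=(5, D_MODEL))   # embeddings of 5 input tokens
steered = steer(x, "positive")
print(steered.shape)                # (15, 768): 10 prefix slots + 5 tokens
```

Because only the prefix parameters are trained while the GPT-2 weights stay frozen, the method is lightweight: switching the generated attribute is just a matter of selecting a different prefix, and multi-aspect control combines prefixes trained for different aspects.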

Authors (5)
  1. Jing Qian (81 papers)
  2. Li Dong (154 papers)
  3. Yelong Shen (83 papers)
  4. Furu Wei (291 papers)
  5. Weizhu Chen (128 papers)
Citations (89)
