PLANET: Dynamic Content Planning in Autoregressive Transformers for Long-form Text Generation (2203.09100v1)

Published 17 Mar 2022 in cs.CL

Abstract: Despite recent progress of pre-trained LLMs in generating fluent text, existing methods still suffer from incoherence in long-form text generation tasks that require proper content control and planning to form a coherent high-level logical flow. In this work, we propose PLANET, a novel generation framework leveraging the autoregressive self-attention mechanism to conduct content planning and surface realization dynamically. To guide the generation of output sentences, our framework enriches the Transformer decoder with latent representations that maintain sentence-level semantic plans grounded by bags of words. Moreover, we introduce a new coherence-based contrastive learning objective to further improve the coherence of the output. Extensive experiments are conducted on two challenging long-form text generation tasks: counterargument generation and opinion article generation. Both automatic and human evaluations show that our method significantly outperforms strong baselines and generates more coherent texts with richer content.
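
The abstract names two training signals: a bag-of-words loss that grounds each latent sentence-plan representation, and a coherence-based contrastive objective. The sketch below is a minimal illustration of how such losses could be written, not the authors' implementation; all function names, shapes, and the margin formulation are assumptions for exposition.

```python
# Illustrative sketch (not the PLANET authors' code) of the two auxiliary
# losses described in the abstract. Names, shapes, and the margin form of
# the contrastive loss are assumptions.
import torch
import torch.nn.functional as F


def bow_planning_loss(plan_vecs, bow_targets, vocab_proj):
    """Ground each latent sentence plan in a bag-of-words distribution.

    plan_vecs:   (num_sents, hidden)  one latent plan vector per output sentence
    bow_targets: (num_sents, vocab)   multi-hot bag of words for each sentence
    vocab_proj:  torch.nn.Linear(hidden, vocab) projecting plans onto the vocabulary
    """
    logits = vocab_proj(plan_vecs)               # (num_sents, vocab)
    log_probs = F.log_softmax(logits, dim=-1)
    # Negative log-likelihood of the words each plan is supposed to cover,
    # averaged over the words in each sentence's bag.
    per_sent = -(bow_targets * log_probs).sum(-1) / bow_targets.sum(-1).clamp(min=1)
    return per_sent.mean()


def coherence_contrastive_loss(pos_score, neg_score, margin=1.0):
    """Margin-based contrastive loss: a coherent (positive) text should score
    higher than a perturbed (negative) one, e.g. with shuffled sentences."""
    return F.relu(margin + neg_score - pos_score).mean()
```

Negatives for the contrastive term could, for example, be built by shuffling or replacing sentences of the gold output, so the model learns to score the original ordering above the perturbed one.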

Authors (6)
  1. Zhe Hu (34 papers)
  2. Hou Pong Chan (36 papers)
  3. Jiachen Liu (45 papers)
  4. Xinyan Xiao (41 papers)
  5. Hua Wu (191 papers)
  6. Lifu Huang (92 papers)
Citations (37)