Improving Adversarial Text Generation by Modeling the Distant Future (2005.01279v1)

Published 4 May 2020 in cs.CL and cs.LG

Abstract: Auto-regressive text generation models usually focus on local fluency, and may cause inconsistent semantic meaning in long text generation. Further, automatically generating words with similar semantics is challenging, and hand-crafted linguistic rules are difficult to apply. We consider a text planning scheme and present a model-based imitation-learning approach to alleviate the aforementioned issues. Specifically, we propose a novel guider network to focus on the generative process over a longer horizon, which can assist next-word prediction and provide intermediate rewards for generator optimization. Extensive experiments demonstrate that the proposed method leads to improved performance.
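To make the abstract's idea concrete, here is a minimal, hypothetical sketch of a "guider" that looks ahead over a longer horizon and supplies intermediate rewards: it predicts a feature of the distant-future generator state, and its agreement with the state actually reached later serves as a step-wise reward. All module names, sizes, and the cosine-based reward used here are illustrative assumptions based only on the abstract, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Guider(nn.Module):
    """Hypothetical guider: predicts a feature summarizing the distant-future state."""

    def __init__(self, hidden_size: int = 256):
        super().__init__()
        self.rnn = nn.GRUCell(hidden_size, hidden_size)
        self.proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, state: torch.Tensor, h: torch.Tensor):
        # Consume the generator's current state, update the guider's own memory,
        # and emit a prediction of where the generation should be heading.
        h = self.rnn(state, h)
        return self.proj(h), h


def intermediate_reward(predicted_future: torch.Tensor,
                        realized_future: torch.Tensor) -> torch.Tensor:
    """One plausible intermediate reward (an assumption): cosine agreement between
    the guider's prediction and the state the generator actually reaches later."""
    return F.cosine_similarity(predicted_future, realized_future, dim=-1)


if __name__ == "__main__":
    # Stand-in generator states; in the real setting these would come from the
    # auto-regressive text generator at each decoding step.
    hidden_size, batch, horizon = 256, 4, 3
    guider = Guider(hidden_size)
    h = torch.zeros(batch, hidden_size)
    gen_states = [torch.randn(batch, hidden_size) for _ in range(8)]

    rewards = []
    for t in range(len(gen_states) - horizon):
        pred, h = guider(gen_states[t], h)
        # Reward step t by how well the guider anticipated the state `horizon` steps ahead.
        rewards.append(intermediate_reward(pred, gen_states[t + horizon].detach()))

    print(torch.stack(rewards).shape)  # (steps, batch) intermediate rewards
```

In this reading, the rewards would then feed whatever policy-gradient or imitation-learning update the generator uses, giving it a signal about long-range consistency rather than only local next-word fluency.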

Authors (8)
  1. Ruiyi Zhang (98 papers)
  2. Changyou Chen (108 papers)
  3. Zhe Gan (135 papers)
  4. Wenlin Wang (27 papers)
  5. Dinghan Shen (34 papers)
  6. Guoyin Wang (108 papers)
  7. Zheng Wen (73 papers)
  8. Lawrence Carin (203 papers)
Citations (12)
