Improving Adversarial Text Generation by Modeling the Distant Future (2005.01279v1)
Published 4 May 2020 in cs.CL and cs.LG
Abstract: Auto-regressive text generation models usually focus on local fluency and may therefore produce semantically inconsistent long texts. Moreover, automatically generating words with consistent semantics is challenging, and hand-crafted linguistic rules are difficult to apply. We consider a text-planning scheme and present a model-based imitation-learning approach to alleviate these issues. Specifically, we propose a novel guider network that attends to the generative process over a longer horizon; it assists next-word prediction and provides intermediate rewards for optimizing the generator. Extensive experiments demonstrate that the proposed method leads to improved performance.
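To make the intermediate-reward idea concrete, here is a minimal toy sketch (not the authors' implementation): a "guider" predicts a feature of the distant future from the current prefix, and the reward at step t is the similarity between that prediction and the feature actually realized some steps later. All names (`bow_feature`, `toy_guider`, `intermediate_reward`) and the bag-of-words feature are illustrative assumptions; the paper's guider is a learned network over latent representations.

```python
import math

def bow_feature(tokens, vocab):
    """Toy sentence feature: L2-normalized bag-of-words counts."""
    v = [tokens.count(w) for w in vocab]
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (math.sqrt(sum(x * x for x in a)) *
           math.sqrt(sum(y * y for y in b))) or 1.0
    return num / den

def toy_guider(prefix, vocab):
    """Stand-in for the guider network: here it simply predicts that the
    future resembles the prefix (a trained network would do better)."""
    return bow_feature(prefix, vocab)

def intermediate_reward(sentence, t, horizon, vocab):
    """Reward at step t: agreement between the guider's prediction from the
    prefix and the feature of the text `horizon` steps ahead."""
    predicted = toy_guider(sentence[:t], vocab)
    realized = bow_feature(sentence[:t + horizon], vocab)
    return cosine(predicted, realized)

vocab = ["the", "cat", "sat", "on", "mat"]
sentence = ["the", "cat", "sat", "on", "the", "mat"]
r = intermediate_reward(sentence, t=3, horizon=3, vocab=vocab)
print(round(r, 3))  # → 0.816
```

Such per-step rewards give the generator a dense training signal over a longer horizon, instead of a single sparse reward at the end of the sequence.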
- Ruiyi Zhang (98 papers)
- Changyou Chen (108 papers)
- Zhe Gan (135 papers)
- Wenlin Wang (27 papers)
- Dinghan Shen (34 papers)
- Guoyin Wang (108 papers)
- Zheng Wen (73 papers)
- Lawrence Carin (203 papers)