Language Generation with Recurrent Generative Adversarial Networks without Pre-training (1706.01399v3)

Published 5 Jun 2017 in cs.CL

Abstract: Generative Adversarial Networks (GANs) have shown great promise recently in image generation. Training GANs for language generation has proven to be more difficult, because of the non-differentiable nature of generating text with recurrent neural networks. Consequently, past work has either resorted to pre-training with maximum-likelihood or used convolutional networks for generation. In this work, we show that recurrent neural networks can be trained to generate text with GANs from scratch using curriculum learning, by slowly teaching the model to generate sequences of increasing and variable length. We empirically show that our approach vastly improves the quality of generated sequences compared to a convolutional baseline.
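To make the curriculum idea concrete, below is a minimal sketch of length-curriculum GAN training for a recurrent text generator. It assumes a continuous relaxation in which the generator emits a softmax distribution over characters at each step (so gradients flow through to the RNN), and it uses a standard non-saturating GAN objective as a stand-in loss; all names, sizes, and hyperparameters are illustrative, not the paper's exact setup.

```python
# Sketch: curriculum over sequence length for a recurrent GAN text generator.
# Hypothetical configuration; the real paper's architecture and loss may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB = 64     # assumed character-vocabulary size
HIDDEN = 128
Z_DIM = 32

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(Z_DIM, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, z, seq_len):
        # Feed the noise vector at every time step; emit a softmax per position
        # so the output stays differentiable end to end.
        steps = z.unsqueeze(1).repeat(1, seq_len, 1)
        h, _ = self.rnn(steps)
        return torch.softmax(self.out(h), dim=-1)    # (batch, seq_len, VOCAB)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(VOCAB, HIDDEN, batch_first=True)
        self.score = nn.Linear(HIDDEN, 1)

    def forward(self, x):
        _, h = self.rnn(x)                # x: one-hot (real) or softmax (fake)
        return self.score(h.squeeze(0))   # one scalar score per sequence

def real_batch(batch, seq_len):
    # Stand-in for sampling real text truncated to the current curriculum length.
    idx = torch.randint(VOCAB, (batch, seq_len))
    return F.one_hot(idx, VOCAB).float()

G, D = Generator(), Discriminator()
g_opt = torch.optim.Adam(G.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(D.parameters(), lr=1e-4)

# Curriculum: slowly raise the maximum sequence length, sampling a variable
# length each step so the model keeps seeing shorter sequences too.
for max_len in range(1, 11):
    for _ in range(100):                  # illustrative inner-loop size
        seq_len = torch.randint(1, max_len + 1, ()).item()
        z = torch.randn(16, Z_DIM)

        # Discriminator step (non-saturating GAN loss, chosen for brevity).
        d_opt.zero_grad()
        real = real_batch(16, seq_len)
        fake = G(z, seq_len).detach()
        d_loss = (F.softplus(-D(real)).mean() + F.softplus(D(fake)).mean())
        d_loss.backward()
        d_opt.step()

        # Generator step.
        g_opt.zero_grad()
        g_loss = F.softplus(-D(G(z, seq_len))).mean()
        g_loss.backward()
        g_opt.step()
```

The key design choice mirrored here is that the discriminator only ever sees sequences no longer than the current curriculum stage, so early training reduces to an easy short-sequence problem before lengths grow.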

Authors (5)
  1. Ofir Press (21 papers)
  2. Amir Bar (31 papers)
  3. Ben Bogin (22 papers)
  4. Jonathan Berant (107 papers)
  5. Lior Wolf (217 papers)
Citations (103)