
TextKD-GAN: Text Generation using Knowledge Distillation and Generative Adversarial Networks (1905.01976v1)

Published 23 Apr 2019 in cs.CL

Abstract: Text generation is of particular interest in many NLP applications such as machine translation, language modeling, and text summarization. Generative adversarial networks (GANs) have achieved remarkable success in high-quality image generation in computer vision, and recently GANs have attracted considerable interest from the NLP community as well. However, achieving similar success in NLP is more challenging due to the discrete nature of text. In this work, we introduce a method that uses knowledge distillation to effectively exploit the GAN setup for text generation. We demonstrate how autoencoders (AEs) can provide a continuous representation of sentences, i.e., a smooth representation that assigns non-zero probabilities to more than one word. We distill this representation to train the generator to synthesize similar smooth representations. We perform a number of experiments to validate our idea on different datasets and show that our proposed approach yields better performance in terms of BLEU score and Jensen-Shannon distance (JSD) compared to traditional GAN-based text generation approaches without pre-training.
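The core idea in the abstract is replacing discrete one-hot word targets with a smooth, continuous distribution over the vocabulary (as produced by an autoencoder's decoder), which the generator is then trained to imitate. The sketch below illustrates this contrast with a temperature-scaled softmax; the logits, vocabulary size, and temperature value are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a
    # smoother (higher-entropy) distribution over the vocabulary.
    z = logits / temperature
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical decoder logits over a 5-word vocabulary at one time step.
logits = np.array([4.0, 1.0, 0.5, 0.2, 0.1])

# Discrete target: all probability mass on a single word.
one_hot = np.zeros_like(logits)
one_hot[logits.argmax()] = 1.0

# Smooth target: non-zero probability on every word, as the
# abstract describes for the AE-based continuous representation.
smooth = softmax(logits, temperature=2.0)
```

In the GAN setup the discriminator sees such smooth distributions from the autoencoder as "real" samples, so the generator's continuous outputs can be compared against them directly, avoiding the non-differentiable argmax step that discrete text would otherwise require.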

Authors (2)
  1. Md. Akmal Haidar (12 papers)
  2. Mehdi Rezagholizadeh (78 papers)
Citations (50)
