Text Generation with Deep Variational GAN (2104.13488v1)
Published 27 Apr 2021 in cs.LG, cs.AI, and cs.CL
Abstract: Generating realistic sequences is a central task in many machine learning applications. There has been considerable recent progress on building deep generative models for sequence generation tasks. However, mode collapse remains a key problem for current models. In this paper we propose a GAN-based generic framework that addresses mode collapse in a principled way. We modify the standard GAN objective to maximize a variational lower bound on the log-likelihood while minimizing the Jensen-Shannon divergence between the data and model distributions. We evaluate our model on a text generation task and show that it can generate realistic text with high diversity.
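The two components named in the abstract can be written in their standard textbook forms (a sketch only; the paper's exact combined objective and notation may differ). The variational lower bound (ELBO) on the log-likelihood, with encoder $q_\phi(z\mid x)$ and prior $p(z)$, is
$$
\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z\mid x)}\!\left[\log p_\theta(x\mid z)\right] \;-\; \mathrm{KL}\!\left(q_\phi(z\mid x)\,\|\,p(z)\right),
$$
and the standard GAN minimax objective,
$$
\min_{G}\,\max_{D}\; \mathbb{E}_{x\sim p_{\mathrm{data}}}\!\left[\log D(x)\right] \;+\; \mathbb{E}_{z\sim p(z)}\!\left[\log\!\left(1 - D(G(z))\right)\right],
$$
is, at the optimal discriminator, equivalent (up to a constant) to minimizing the Jensen-Shannon divergence $\mathrm{JSD}(p_{\mathrm{data}}\,\|\,p_G)$. The proposed framework couples these two terms so that the generator both tightens the lower bound and reduces the JS divergence.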
- Mahmoud Hossam (6 papers)
- Trung Le (94 papers)
- Michael Papasimeon (6 papers)
- Viet Huynh (10 papers)
- Dinh Phung (147 papers)