
GANs with Variational Entropy Regularizers: Applications in Mitigating the Mode-Collapse Issue (2009.11921v1)

Published 24 Sep 2020 in cs.LG, cs.IT, eess.SP, math.IT, and stat.ML

Abstract: Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learning a probability distribution from observed samples. GANs are often formulated as a zero-sum game between two sets of functions: the generator and the discriminator. Although GANs have shown great potential in learning complex distributions such as images, they often suffer from the mode-collapse issue, in which the generator fails to capture all existing modes of the input distribution. As a consequence, the diversity of generated samples is lower than that of the observed ones. To tackle this issue, we take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity. We call this approach GANs with Variational Entropy Regularizers (GAN+VER). Existing remedies for the mode-collapse issue in GANs can be easily coupled with our proposed variational entropy regularization. Through extensive experimentation on standard benchmark datasets, we show that existing evaluation metrics quantifying the difference between real and generated samples improve significantly with GAN+VER.
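
The abstract does not spell out which variational lower bound on the entropy is used. One standard construction that fits the description is a Barber-Agakov-style bound, H(G(z)) >= E_z[log q(z | G(z))] + H(z), where q is an auxiliary encoder that tries to recover the latent code from a generated sample; since H(z) is constant with respect to the generator, maximizing the encoder's reconstruction log-likelihood raises the bound. The PyTorch sketch below illustrates how such a regularizer could be attached to a standard generator update. It is a minimal sketch under that assumption: `Encoder`, `lambda_ver`, and the diagonal-Gaussian posterior are illustrative choices, not details taken from the paper.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Variational posterior q(z | x): predicts a diagonal Gaussian over z."""
    def __init__(self, x_dim, z_dim, hidden=256):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)
        self.log_var = nn.Linear(hidden, z_dim)

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.log_var(h)

def gaussian_log_prob(z, mu, log_var):
    """log N(z; mu, diag(exp(log_var))), summed over latent dimensions."""
    return -0.5 * (log_var + (z - mu) ** 2 / log_var.exp()
                   + math.log(2 * math.pi)).sum(dim=1)

def generator_step(G, D, Q, optimizer, z_dim, batch_size, lambda_ver=0.1):
    """One generator update: adversarial loss plus entropy regularizer.

    `optimizer` is assumed to cover the parameters of both G and Q,
    so the encoder is trained jointly with the generator.
    """
    optimizer.zero_grad()
    z = torch.randn(batch_size, z_dim)
    x_fake = G(z)
    # Standard non-saturating generator loss against the discriminator.
    adv_loss = F.binary_cross_entropy_with_logits(
        D(x_fake), torch.ones(batch_size, 1))
    # Variational entropy term: E_z[log q(z | G(z))] lower-bounds
    # H(G(z)) up to the constant H(z), so maximizing it encourages
    # the generator to produce more diverse samples.
    mu, log_var = Q(x_fake)
    entropy_bound = gaussian_log_prob(z, mu, log_var).mean()
    # Minimize the adversarial loss while maximizing the entropy bound.
    loss = adv_loss - lambda_ver * entropy_bound
    loss.backward()
    optimizer.step()
    return loss.item()
```

As the abstract notes, a term of this form is loss-agnostic, so it can be added on top of other mode-collapse remedies without changing the underlying GAN objective.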

Authors (4)
  1. Pirazh Khorramshahi (10 papers)
  2. Hossein Souri (12 papers)
  3. Rama Chellappa (190 papers)
  4. Soheil Feizi (127 papers)
Citations (4)