
An Empirical Comparison of GANs and Normalizing Flows for Density Estimation (2006.10175v2)

Published 17 Jun 2020 in cs.LG and stat.ML

Abstract: Generative adversarial networks (GANs) and normalizing flows are both approaches to density estimation that use deep neural networks to transform samples from an uninformative prior distribution to an approximation of the data distribution. There is great interest in both for general-purpose statistical modeling, but the two approaches have seldom been compared to each other for modeling non-image data. The difficulty of computing likelihoods with GANs, which are implicit models, makes conducting such a comparison challenging. We work around this difficulty by considering several low-dimensional synthetic datasets. An extensive grid search over GAN architectures, hyperparameters, and training procedures suggests that no GAN is capable of modeling our simple low-dimensional data well, a task we view as a prerequisite for an approach to be considered suitable for general-purpose statistical modeling. Several normalizing flows, on the other hand, excelled at these tasks, even substantially outperforming WGAN in terms of Wasserstein distance -- the metric that WGAN alone targets. Scientists and other practitioners should be wary of relying on WGAN for applications that require accurate density estimation.
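The metric the abstract highlights, Wasserstein distance, has a simple closed form for one-dimensional empirical distributions with equal sample sizes: sort both samples and average the absolute differences between quantile-matched values. The sketch below illustrates that formulation; the toy arrays are hypothetical and not the paper's synthetic datasets.

```python
import numpy as np

def wasserstein1(x, y):
    """Empirical 1-D Wasserstein-1 distance for equal-size samples:
    the mean absolute difference between sorted (quantile-matched) values."""
    assert len(x) == len(y), "this closed form assumes equal sample sizes"
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

# Toy check: shifting a sample by a constant c shifts W1 by exactly c.
x = np.array([0.0, 1.0, 2.0])
print(wasserstein1(x, x + 1.0))  # 1.0
```

For higher-dimensional data, as in the paper's experiments, no such closed form exists and the distance must be estimated, e.g. via optimal-transport solvers or the dual formulation that WGAN itself approximates.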

Authors (2)
  1. Tianci Liu (18 papers)
  2. Jeffrey Regier (29 papers)
Citations (4)
