
Radon Sobolev Variational Auto-Encoders (1911.13135v3)

Published 29 Nov 2019 in cs.LG, cs.AI, cs.CV, and cs.NE

Abstract: The quality of generative models (such as Generative Adversarial Networks and Variational Auto-Encoders) depends heavily on the choice of a good probability distance. However, some popular metrics, such as the Wasserstein and Sliced Wasserstein distances, the Jensen-Shannon divergence, and the Kullback-Leibler divergence, lack convenient properties such as (geodesic) convexity and fast evaluation. To address these shortcomings, we introduce a class of distances with built-in convexity. We investigate their relationship with several known paradigms (sliced distances, a synonym for Radon distances; reproducing kernel Hilbert spaces; energy distances). The distances are shown to admit fast implementations and are incorporated into an adapted Variational Auto-Encoder, termed the Radon Sobolev Variational Auto-Encoder (RS-VAE), which produces high-quality results on standard generative datasets. Keywords: Variational Auto-Encoder; Generative model; Sobolev spaces; Radon Sobolev Variational Auto-Encoder.
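To give a concrete sense of the "sliced" (Radon) distance paradigm the abstract refers to, here is a minimal Python/PyTorch sketch of a generic sliced distance between two sample batches, used as a latent-space regularizer in a VAE-style loss. This is an illustration of the general family only, not the paper's specific Radon Sobolev distance or the RS-VAE objective; the function names and the weighting parameter `lam` are hypothetical.

```python
import torch

def sliced_distance(x, y, num_projections=50):
    """Average 1-D squared distance between sorted random projections.

    x, y: tensors of shape (batch, dim) with the same batch size.
    A generic member of the sliced-distance family: project both sample
    sets onto random directions of the unit sphere, then compare the
    sorted 1-D projections (as in the sliced Wasserstein distance).
    """
    dim = x.size(1)
    # Random unit directions on the sphere S^{dim-1}
    theta = torch.randn(num_projections, dim, device=x.device)
    theta = theta / theta.norm(dim=1, keepdim=True)
    # Project both sample sets: shape (batch, num_projections)
    x_proj = x @ theta.t()
    y_proj = y @ theta.t()
    # In 1-D, optimal matching pairs sorted samples.
    x_sorted, _ = torch.sort(x_proj, dim=0)
    y_sorted, _ = torch.sort(y_proj, dim=0)
    return ((x_sorted - y_sorted) ** 2).mean()

# Hypothetical usage in a VAE-style training step: penalize the mismatch
# between encoded latents and prior samples alongside the reconstruction loss.
# loss = reconstruction_loss + lam * sliced_distance(z_encoded, z_prior)
```

The RS-VAE replaces the 1-D comparison above with a Sobolev-type distance that, per the abstract, is convex by construction and fast to evaluate.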

Citations (3)
