
Variational Representations and Neural Network Estimation of Rényi Divergences (2007.03814v4)

Published 7 Jul 2020 in stat.ML, cs.IT, cs.LG, math.IT, and math.PR

Abstract: We derive a new variational formula for the Rényi family of divergences, $R_\alpha(Q|P)$, between probability measures $Q$ and $P$. Our result generalizes the classical Donsker-Varadhan variational formula for the Kullback-Leibler divergence. We further show that this Rényi variational formula holds over a range of function spaces; this leads to a formula for the optimizer under very weak assumptions and is also key in our development of a consistency theory for Rényi divergence estimators. By applying this theory to neural-network estimators, we show that if a neural network family satisfies one of several strengthened versions of the universal approximation property then the corresponding Rényi divergence estimator is consistent. In contrast to density-estimator based methods, our estimators involve only expectations under $Q$ and $P$ and hence are more effective in high dimensional systems. We illustrate this via several numerical examples of neural network estimation in systems of up to 5000 dimensions.
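The classical Donsker-Varadhan formula that the paper generalizes, $KL(Q\|P) = \sup_g \, E_Q[g] - \log E_P[e^g]$, can be checked numerically. The sketch below is not from the paper; it is a minimal illustration for two one-dimensional Gaussians, where the optimizer is known to be the log-likelihood ratio $g^*(x) = \log(dQ/dP)(x)$, and the names (`g_star`, `dv_estimate`) are our own.

```python
import numpy as np

# Monte Carlo check of the Donsker-Varadhan variational formula
#   KL(Q||P) = sup_g  E_Q[g] - log E_P[exp(g)].
# With Q = N(1,1) and P = N(0,1), the optimizer is the
# log-likelihood ratio g*(x) = log(dQ/dP)(x) = x - 1/2, and the
# closed-form divergence is KL = (mu_Q - mu_P)^2 / 2 = 0.5.
rng = np.random.default_rng(0)
n = 200_000
x_q = rng.normal(1.0, 1.0, n)  # samples from Q
x_p = rng.normal(0.0, 1.0, n)  # samples from P

def g_star(x):
    # optimal test function: log-likelihood ratio for these Gaussians
    return x - 0.5

# Plugging g* into the DV objective recovers the divergence; note the
# estimator uses only expectations under Q and P, no density estimates.
dv_estimate = g_star(x_q).mean() - np.log(np.exp(g_star(x_p)).mean())
print(dv_estimate)  # close to the true KL of 0.5
```

The paper's neural-network estimators replace the known optimizer `g_star` with a network trained to maximize this objective (or its Rényi analogue), which is what makes the approach usable when the density ratio is unknown.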

Authors (5)
  1. Jeremiah Birrell (41 papers)
  2. Paul Dupuis (40 papers)
  3. Markos A. Katsoulakis (49 papers)
  4. Luc Rey-Bellet (38 papers)
  5. Jie Wang (481 papers)
Citations (29)
