
Kernelized Complete Conditional Stein Discrepancy

Published 9 Apr 2019 in stat.ML and cs.LG (arXiv:1904.04478v4)

Abstract: Much of machine learning relies on comparing distributions with discrepancy measures. Stein's method yields discrepancy measures between two distributions that require only the unnormalized density of one and samples from the other. Stein discrepancies can be combined with kernels to define kernelized Stein discrepancies (KSDs). While kernels make Stein discrepancies tractable, they pose several challenges in high dimensions. We introduce kernelized complete conditional Stein discrepancies (KCC-SDs). Complete conditionals turn a multivariate distribution into multiple univariate distributions. We show that KCC-SDs distinguish distributions. To demonstrate their efficacy, we introduce a goodness-of-fit test based on KCC-SDs. We empirically show that KCC-SDs have higher power than baselines and use them to assess sample quality in Markov chain Monte Carlo.
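For context, the standard kernelized Stein discrepancy that KCC-SDs build on depends on the target only through the score function s_p(x) = ∇_x log p(x), which is unaffected by the normalizing constant. The sketch below states that standard form (as in Liu et al., 2016 and Chwialkowski et al., 2016); the complete-conditional remark after it is a hedged paraphrase of the abstract's idea, not the paper's exact definition.

```latex
% Standard KSD between samples from q and an unnormalized target p,
% using the Langevin Stein operator and a kernel k.
% Score: s_p(x) = \nabla_x \log p(x) (the normalizing constant cancels).
\[
\mathrm{KSD}^2(q, p) = \mathbb{E}_{x, x' \sim q}\!\left[\, u_p(x, x') \,\right],
\]
\[
u_p(x, x') = s_p(x)^\top k(x, x')\, s_p(x')
           + s_p(x)^\top \nabla_{x'} k(x, x')
           + \nabla_x k(x, x')^\top s_p(x')
           + \operatorname{tr}\!\left( \nabla_x \nabla_{x'} k(x, x') \right).
\]
% Hedged paraphrase of the complete-conditional idea: each coordinate
% x_i is treated through its univariate complete conditional
% p(x_i \mid x_{-i}). Since p(x) = p(x_i \mid x_{-i})\, p(x_{-i}) and
% the second factor is constant in x_i,
%   \partial_{x_i} \log p(x_i \mid x_{-i}) = \partial_{x_i} \log p(x),
% so each univariate score remains computable from the unnormalized
% joint density.
```

Under this reading, the construction trades one d-dimensional kernel comparison for d univariate ones, which is consistent with the high-dimensional motivation stated in the abstract.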

Citations (7)
