Kullback-Leibler Divergence for the Normal-Gamma Distribution

Published 4 Nov 2016 in math.ST, q-bio.NC, and stat.TH (arXiv:1611.01437v1)

Abstract: We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one to simulated and one to empirical data.
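While the paper derives a closed-form expression, the KL divergence between two normal-gamma distributions can also be approximated numerically. The sketch below, a minimal illustration using the standard parameterization (x | τ ~ N(μ, (λτ)⁻¹), τ ~ Gamma(a, rate b)), estimates KL(p ‖ q) by Monte Carlo; the parameter values are illustrative, not taken from the paper.

```python
import numpy as np
from scipy import stats

def ng_logpdf(x, tau, mu, lam, a, b):
    # Joint log density of the normal-gamma distribution:
    # tau ~ Gamma(shape=a, rate=b), x | tau ~ Normal(mu, 1/(lam*tau))
    return (stats.gamma.logpdf(tau, a, scale=1.0 / b)
            + stats.norm.logpdf(x, loc=mu, scale=1.0 / np.sqrt(lam * tau)))

def ng_sample(rng, n, mu, lam, a, b):
    # Draw n samples (x, tau) from the normal-gamma distribution
    tau = rng.gamma(a, 1.0 / b, size=n)
    x = rng.normal(mu, 1.0 / np.sqrt(lam * tau))
    return x, tau

def kl_mc(p, q, n=200_000, seed=0):
    # Monte Carlo estimate of KL(p || q) = E_p[log p(x,tau) - log q(x,tau)]
    rng = np.random.default_rng(seed)
    x, tau = ng_sample(rng, n, *p)
    return np.mean(ng_logpdf(x, tau, *p) - ng_logpdf(x, tau, *q))

# Illustrative parameters (mu, lambda, a, b); hypothetical values
p = (0.0, 1.0, 2.0, 1.0)
q = (1.0, 2.0, 3.0, 2.0)

print(kl_mc(p, p))  # 0 up to floating-point rounding
print(kl_mc(p, q))  # positive, as KL divergence must be
```

Such an estimate is a useful sanity check against any closed-form derivation: the analytic formula should match the Monte Carlo value to within sampling error.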
