Kullback-Leibler Divergence for the Normal-Gamma Distribution
Published 4 Nov 2016 in math.ST, q-bio.NC, and stat.TH (arXiv:1611.01437v1)
Abstract: We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one using simulated and one using empirical data.
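The closed-form divergence the abstract refers to can be sketched numerically. The snippet below is a minimal illustration, not the paper's own code: it uses the standard chain-rule decomposition KL[p(x,t)||q(x,t)] = E_p(t)[KL[p(x|t)||q(x|t)]] + KL[p(t)||q(t)] for the normal-gamma density p(x,t) = N(x | mu, (lam*t)^-1) * Gam(t | a, b), with a rate parametrization of the gamma; all variable names and example values are illustrative assumptions.

import numpy as np
from scipy.special import gammaln, digamma

def kl_normal_gamma(mu1, lam1, a1, b1, mu2, lam2, a2, b2):
    """KL[NG(mu1,lam1,a1,b1) || NG(mu2,lam2,a2,b2)] in nats."""
    # Expected KL of the conditional normals; E[t] = a1/b1 under Gam(a1,b1)
    kl_normal = 0.5 * (a1 / b1) * lam2 * (mu1 - mu2) ** 2 \
              + 0.5 * (lam2 / lam1) - 0.5 * np.log(lam2 / lam1) - 0.5
    # KL between the gamma marginals over the precision t (rate parametrization)
    kl_gamma = (a1 - a2) * digamma(a1) - gammaln(a1) + gammaln(a2) \
             + a2 * (np.log(b1) - np.log(b2)) + a1 * (b2 - b1) / b1
    return kl_normal + kl_gamma

# Sanity checks: zero for identical distributions, positive otherwise
print(kl_normal_gamma(0.0, 1.0, 2.0, 1.0, 0.0, 1.0, 2.0, 1.0))  # 0.0
print(kl_normal_gamma(0.0, 1.0, 2.0, 1.0, 1.0, 2.0, 3.0, 1.5))  # > 0

Because the divergence decomposes into a conditional-normal term and a gamma term, each piece can be verified independently against the known univariate normal and gamma KL formulas.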