A Bayesian Characterization of Relative Entropy (1402.3067v2)

Published 13 Feb 2014 in cs.IT, math-ph, math.IT, math.MP, math.PR, and quant-ph

Abstract: We give a new characterization of relative entropy, also known as the Kullback-Leibler divergence. We use a number of interesting categories related to probability theory. In particular, we consider a category FinStat where an object is a finite set equipped with a probability distribution, while a morphism is a measure-preserving function $f: X \to Y$ together with a stochastic right inverse $s: Y \to X$. The function $f$ can be thought of as a measurement process, while $s$ provides a hypothesis about the state of the measured system given the result of a measurement. Given this data we can define the entropy of the probability distribution on $X$ relative to the "prior" given by pushing the probability distribution on $Y$ forwards along $s$. We say that $s$ is "optimal" if these distributions agree. We show that any convex linear, lower semicontinuous functor from FinStat to the additive monoid $[0,\infty]$ which vanishes when $s$ is optimal must be a scalar multiple of this relative entropy. Our proof is independent of all earlier characterizations, but inspired by the work of Petz.
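The construction in the abstract is concrete enough to compute on small examples. Below is a minimal sketch, not code from the paper: given a distribution $p$ on a finite set $X$, it pushes $p$ forward along $f$ to get $q$ on $Y$, pushes $q$ back along the stochastic map $s$ to get the prior $r$ on $X$, and evaluates the relative entropy $D(p \| r) = \sum_x p(x)\log(p(x)/r(x))$, which vanishes exactly when $s$ is optimal, i.e. $r = p$. All function names and the example data are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# Sketch of the FinStat setup on finite sets (illustrative, not from the paper).
# X = {0,...,m-1}, Y = {0,...,n-1}; p is a distribution on X; f maps X to Y;
# s is a column-stochastic matrix s[x, y] = Pr(x | y), a stochastic right
# inverse of f, so s[x, y] > 0 only when f(x) == y.

def pushforward(p, f, n):
    """Push the distribution p on X forward along f to get q on Y."""
    q = np.zeros(n)
    for x, px in enumerate(p):
        q[f[x]] += px
    return q

def prior_from_hypothesis(q, s):
    """Push q on Y forward along the stochastic map s to get the prior r on X."""
    return s @ q

def relative_entropy(p, r):
    """D(p || r) = sum_x p(x) log(p(x)/r(x)), with the usual conventions:
    0 * log(0/r) = 0, and p(x) > 0 with r(x) = 0 gives +infinity."""
    total = 0.0
    for px, rx in zip(p, r):
        if px == 0:
            continue
        if rx == 0:
            return np.inf
        total += px * np.log(px / rx)
    return total

# Example: X = {0,1,2}, Y = {0,1}; f collapses {0,1} to 0 and sends 2 to 1.
p = np.array([0.2, 0.3, 0.5])
f = [0, 0, 1]
q = pushforward(p, f, n=2)            # q = (0.5, 0.5)

# A hypothesis s that splits y = 0 evenly between x = 0 and x = 1.
s = np.array([[0.5, 0.0],
              [0.5, 0.0],
              [0.0, 1.0]])
r = prior_from_hypothesis(q, s)       # r = (0.25, 0.25, 0.5)
print(relative_entropy(p, r))         # positive, so this s is not optimal

# The optimal hypothesis recovers p exactly, giving relative entropy 0.
s_opt = np.array([[0.4, 0.0],
                  [0.6, 0.0],
                  [0.0, 1.0]])
r_opt = prior_from_hypothesis(q, s_opt)
print(relative_entropy(p, r_opt))     # 0.0
```

The second hypothesis is optimal because pushing $q$ forward along it reproduces $p$ exactly; the paper's theorem says that, up to a scalar, this relative entropy is the only convex linear, lower semicontinuous invariant that vanishes in precisely these cases.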

Citations (60)
