A Divergence Formula for Randomness and Dimension (Short Version) (0906.4162v1)

Published 23 Jun 2009 in cs.CC, cs.IT, and math.IT

Abstract: If $S$ is an infinite sequence over a finite alphabet $\Sigma$ and $\beta$ is a probability measure on $\Sigma$, then the {\it dimension} of $S$ with respect to $\beta$, written $\dim^\beta(S)$, is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension $\dim(S)$ when $\beta$ is the uniform probability measure. This paper shows that $\dim^\beta(S)$ and its dual $\mathrm{Dim}^\beta(S)$, the {\it strong dimension} of $S$ with respect to $\beta$, can be used in conjunction with randomness to measure the similarity of two probability measures $\alpha$ and $\beta$ on $\Sigma$. Specifically, we prove that the {\it divergence formula} $$\dim^\beta(R) = \mathrm{Dim}^\beta(R) = \frac{\mathcal{H}(\alpha)}{\mathcal{H}(\alpha) + \mathcal{D}(\alpha \| \beta)}$$ holds whenever $\alpha$ and $\beta$ are computable, positive probability measures on $\Sigma$ and $R \in \Sigma^\infty$ is random with respect to $\alpha$. In this formula, $\mathcal{H}(\alpha)$ is the Shannon entropy of $\alpha$, and $\mathcal{D}(\alpha \| \beta)$ is the Kullback-Leibler divergence between $\alpha$ and $\beta$.
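The right-hand side of the divergence formula is purely arithmetic in $\mathcal{H}(\alpha)$ and $\mathcal{D}(\alpha \| \beta)$, so a small numerical sketch can make it concrete. The Python snippet below (function names are illustrative, not from the paper) evaluates the formula for two example measures on the binary alphabet; note that the ratio is independent of the logarithm base, since entropy and divergence rescale by the same factor.

```python
import math

def shannon_entropy(alpha):
    """Shannon entropy H(alpha), in bits, of a measure on a finite alphabet."""
    return -sum(p * math.log2(p) for p in alpha if p > 0)

def kl_divergence(alpha, beta):
    """Kullback-Leibler divergence D(alpha || beta), in bits.
    Assumes beta is positive, as the theorem requires."""
    return sum(a * math.log2(a / b) for a, b in zip(alpha, beta) if a > 0)

def divergence_formula(alpha, beta):
    """H(alpha) / (H(alpha) + D(alpha || beta)) -- by the paper's theorem,
    this equals dim^beta(R) for any R random with respect to alpha."""
    h = shannon_entropy(alpha)
    return h / (h + kl_divergence(alpha, beta))

# Example on Sigma = {0, 1}: R random w.r.t. the uniform measure alpha,
# with dimension measured against a skewed measure beta.
alpha = (0.5, 0.5)
beta = (0.9, 0.1)
print(divergence_formula(alpha, beta))   # approx. 0.5757
print(divergence_formula(alpha, alpha))  # 1.0: D = 0 when alpha = beta
```

The second call illustrates the expected boundary case: when $\beta = \alpha$, the divergence vanishes and the formula gives dimension 1, matching the fact that sequences random with respect to $\alpha$ have full dimension relative to $\alpha$ itself.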
