
Universal Bayesian Measures and Universal Histogram Sequences (1405.6033v1)

Published 23 May 2014 in cs.IT and math.IT

Abstract: Consider universal data compression: the length $l(x^n)$ of a sequence $x^n \in A^n$ with finite alphabet $A$ and length $n$ satisfies Kraft's inequality over $A^n$, and $-\frac{1}{n}\log \frac{P^n(x^n)}{Q^n(x^n)}$ almost surely converges to zero as $n$ grows, where $Q^n(x^n)=2^{-l(x^n)}$, for any stationary ergodic source $P$. In this paper, we say that such a $Q$ is a universal Bayesian measure. We generalize the notion to sources in which the random variables may be discrete, continuous, or neither. The basic idea is due to Boris Ryabko, who used model weighting over histograms that approximate $P$, assuming that a density function of $P$ exists. However, the range of $P$ that is covered depends on the choice of the histogram sequence. The universal Bayesian measure constructed in this paper overcomes these drawbacks, has many applications to inferring relations among random variables, and extends the application area of the minimum description length principle.
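To make the histogram-weighting idea concrete, here is a minimal Python sketch under assumptions of my own, not taken from the paper: the data lie in $[0,1)$, the histograms are dyadic equal-width partitions, and each level uses a Krichevsky-Trofimov estimator over its bins. The function names (kt_sequential_log_prob, histogram_mixture_log_density) and the prior weights are hypothetical. The sketch only illustrates the general flavor of Ryabko-style model weighting: a prior-weighted mixture over histogram levels yields a single measure whose per-sample log-density can be evaluated for any sample.

```python
import numpy as np

def kt_sequential_log_prob(bin_ids, n_bins):
    """Log of the Krichevsky-Trofimov probability of a discrete bin sequence."""
    counts = np.zeros(n_bins)
    log_prob = 0.0
    for b in bin_ids:
        # add-1/2 sequential predictive probability
        log_prob += np.log((counts[b] + 0.5) / (counts.sum() + 0.5 * n_bins))
        counts[b] += 1
    return log_prob

def histogram_mixture_log_density(x, max_level=8):
    """
    Mixture over dyadic histograms of [0,1): level j uses 2^j equal-width bins.
    Each level contributes the KT probability of the quantized sequence,
    converted to a density by the (n_bins)^n Jacobian of the quantization.
    Levels are mixed with prior weights w_j = 2^{-(j+1)}, which sum to < 1.
    """
    x = np.asarray(x)
    n = len(x)
    log_terms = []
    for j in range(max_level):
        n_bins = 2 ** j
        bins = np.minimum((x * n_bins).astype(int), n_bins - 1)
        log_q_j = kt_sequential_log_prob(bins, n_bins) + n * np.log(n_bins)
        log_terms.append(-(j + 1) * np.log(2.0) + log_q_j)
    # log-sum-exp over histogram levels
    m = max(log_terms)
    return m + np.log(sum(np.exp(t - m) for t in log_terms))

# Example (hypothetical): samples from a Beta(2, 5) density on [0, 1)
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=2000)
print(histogram_mixture_log_density(x) / len(x))  # per-sample log-density of the mixture
```

The drawback the paper addresses appears here as the fixed choice of dyadic partitions: how fine the histograms are, and how they are scheduled, determines which sources $P$ the mixture can track.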
