Hilberg Exponents: New Measures of Long Memory in the Process (1403.1757v3)

Published 7 Mar 2014 in cs.IT and math.IT

Abstract: The paper concerns the rates of power-law growth of mutual information computed for a stationary measure or for a universal code. These rates are called Hilberg exponents, and four such quantities are defined for each measure and each code: two random exponents and two expected exponents. A particularly interesting case arises for conditional algorithmic mutual information. In this case, the random Hilberg exponents are almost surely constant on ergodic sources and are bounded by the expected Hilberg exponents. This property is a "second-order" analogue of the Shannon-McMillan-Breiman theorem, proved without invoking the ergodic theorem. It carries over to Hilberg exponents for the underlying probability measure via Shannon-Fano coding and the Barron inequality. Moreover, the expected Hilberg exponents can be linked for different universal codes: if one code dominates another, the expected Hilberg exponents are greater for the former than for the latter. The paper concludes with an evaluation of Hilberg exponents for certain sources, such as the mixture Bernoulli process and the Santa Fe processes.
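The abstract describes Hilberg exponents as rates of power-law growth of mutual information, i.e. exponents β in a relation of the form I(n) ∼ A·n^β. As an illustrative sketch (not the paper's formal limsup definition, and with synthetic data rather than a real source), such a growth exponent can be estimated from a sequence of mutual information values by log-log regression:

```python
import math

def hilberg_exponent_estimate(ns, mi_values):
    """Estimate beta in I(n) ~ A * n**beta by least-squares
    regression of log I(n) against log n.

    Illustrative only: the Hilberg exponents in the paper are
    defined via limits superior, not via a finite-sample fit.
    """
    xs = [math.log(n) for n in ns]
    ys = [math.log(i) for i in mi_values]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    # Slope of the regression line in log-log coordinates.
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Synthetic mutual information values growing like n**0.5,
# a stand-in for a process with Hilberg exponent 1/2.
ns = [2 ** k for k in range(4, 14)]
mi = [n ** 0.5 for n in ns]
print(round(hilberg_exponent_estimate(ns, mi), 3))  # → 0.5
```

On an exact power law the fit recovers the exponent exactly; for empirical mutual information estimates the fitted slope is only a finite-sample proxy for the asymptotic exponent.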

Citations (8)
