On typical encodings of multivariate ergodic sources (1906.02570v2)

Published 6 Jun 2019 in cs.IT and math.IT

Abstract: We show that the typical coordinate-wise encoding of a multivariate ergodic source into prescribed alphabets has an entropy profile close to the convolution of the entropy profile of the source with the modular polymatroid determined by the cardinalities of the output alphabets. We show that the proportion of exceptional encodings, those not close to the convolution, goes to zero doubly exponentially. The result holds for a class of multivariate sources that satisfy the asymptotic equipartition property, described via the mean fluctuation of the information functions. This class covers asymptotically mean stationary processes with ergodic mean, ergodic processes, and irreducible Markov chains with an arbitrary initial distribution. We also prove that typical encodings yield the asymptotic equipartition property for the output variables. These asymptotic results rest on an explicit lower bound on the proportion of encodings that transform a multivariate random variable into a variable whose entropy profile is close to the suitable convolution.
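For orientation, here is a minimal sketch of the objects the abstract refers to, written with standard definitions; the notation (ground set N, alphabets Z_i, profiles h and m) is ours, not taken from the paper, and the convolution is the usual one for polymatroids.

% A multivariate random variable X = (X_i)_{i \in N} over a finite ground set N
% has the entropy profile (a polymatroid on N)
\[
  h_X(A) \;=\; H(X_A), \qquad A \subseteq N .
\]
% Coordinate-wise encodings f_i into prescribed output alphabets Z_i determine
% the modular polymatroid given by the alphabet cardinalities,
\[
  m(A) \;=\; \sum_{i \in A} \log \lvert Z_i \rvert ,
\]
% and the convolution of two polymatroids is, in the usual sense,
\[
  (h_X \ast m)(A) \;=\; \min_{B \subseteq A} \bigl( h_X(B) + m(A \setminus B) \bigr).
\]
% The abstract's claim, in this notation: for a typical choice of the encodings f_i,
% the entropy profile of the encoded variables (f_i(X_i))_{i \in N} is close to h_X \ast m.

This is only an illustrative rendering of the terminology; the paper's precise block-coding setup, normalizations, and notion of "close" are given in the full text.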
