Weighted information and entropy rates (1612.09169v1)

Published 29 Dec 2016 in cs.IT, math.IT, and math.PR

Abstract: The weighted entropy $H^{\rm w}_\phi(X)=H^{\rm w}_\phi(f)$ of a random variable $X$ with values $x$ and a probability-mass/density function $f$ is defined as the mean value ${\mathbb E}\,I^{\rm w}_\phi(X)$ of the weighted information $I^{\rm w}_\phi(x)=-\phi(x)\log f(x)$. Here $x\mapsto\phi(x)\in{\mathbb R}$ is a given weight function (WF) indicating a 'value' of outcome $x$. For an $n$-component random vector ${\mathbf X}_0^{n-1}=(X_0,\ldots,X_{n-1})$ produced by a random process ${\mathbf X}=(X_i,\ i\in{\mathbb Z})$, the weighted information $I^{\rm w}_{\phi_n}({\mathbf x}_0^{n-1})$ and weighted entropy $H^{\rm w}_{\phi_n}({\mathbf X}_0^{n-1})$ are defined similarly, with a WF $\phi_n({\mathbf x}_0^{n-1})$. Two types of WF $\phi_n$ are considered, based on additive and multiplicative forms ($\phi_n({\mathbf x}_0^{n-1})=\sum\limits_{i=0}^{n-1}\varphi(x_i)$ and $\phi_n({\mathbf x}_0^{n-1})=\prod\limits_{i=0}^{n-1}\varphi(x_i)$, respectively). The focus is upon ${\it rates}$ of the weighted entropy and information, regarded as parameters related to ${\mathbf X}$. We show that, in the context of ergodicity, the natural scales for an asymptotically additive and an asymptotically multiplicative WF are $\frac{1}{n^2}H^{\rm w}_{\phi_n}({\mathbf X}_0^{n-1})$ and $\frac{1}{n}\log H^{\rm w}_{\phi_n}({\mathbf X}_0^{n-1})$, respectively. This gives rise to ${\it primary\ rates}$. The next-order terms can also be identified, leading to ${\it secondary\ rates}$. We also consider emerging generalisations of the Shannon-McMillan-Breiman theorem.
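
The definitions above translate directly into a short numeric check. The following is a minimal Python sketch, not taken from the paper: the function names and the biased-coin example are illustrative, and it specialises to an i.i.d. process (the simplest ergodic case), whereas the paper treats general ergodic processes. Under these assumptions it computes the weighted entropy of a discrete distribution and illustrates why $\frac{1}{n^2}H^{\rm w}_{\phi_n}$ is the natural scale for the additive WF.

```python
import numpy as np

def weighted_entropy(f, phi):
    """Weighted entropy H^w_phi(X) = -sum_x phi(x) f(x) log f(x) of a
    discrete random variable, with pmf f and weight function phi given
    as arrays over the same outcome set."""
    f, phi = np.asarray(f, dtype=float), np.asarray(phi, dtype=float)
    m = f > 0  # convention: 0 * log 0 = 0
    return -np.sum(phi[m] * f[m] * np.log(f[m]))

# Hypothetical single-letter example: a biased coin, with a WF that
# values outcome 1 twice as much as outcome 0.
f = np.array([0.3, 0.7])        # pmf of X
phi = np.array([1.0, 2.0])      # weight function varphi

h = weighted_entropy(f, np.ones_like(f))  # phi == 1 recovers Shannon entropy
h_w = weighted_entropy(f, phi)            # H^w_varphi(X)
bar_phi = float(np.sum(phi * f))          # E varphi(X)

# Additive WF phi_n(x) = sum_i varphi(x_i) on an i.i.d. vector X_0^{n-1}:
# expanding E[phi_n(X) * (-log f_n(X))] over index pairs (i, j) gives
#   H^w_{phi_n} = n(n-1) * E[varphi] * h + n * H^w_varphi(X),
# so the primary rate H^w_{phi_n} / n^2 tends to E[varphi] * h.
for n in (10, 100, 1000):
    H_n = n * (n - 1) * bar_phi * h + n * h_w
    print(f"n={n:5d}  H^w/n^2 = {H_n / n**2:.6f}  ->  {bar_phi * h:.6f}")

# Multiplicative WF phi_n(x) = prod_i varphi(x_i): the same expansion gives
#   H^w_{phi_n} = n * bar_phi**(n-1) * H^w_varphi(X),
# so (1/n) log H^w_{phi_n} tends to log E[varphi], matching the
# logarithmic scale quoted for that case.
```

In the i.i.d. setting the cross terms $i\neq j$ factorise by independence, which is what makes the $n(n-1)$ count, and hence the $n^2$ growth, visible; the paper's results extend this to ergodic processes and asymptotically additive/multiplicative WFs.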

Citations (4)
