Asymptotic Expansions of Smooth Rényi Entropies and Their Applications (2003.05545v1)

Published 11 Mar 2020 in cs.IT and math.IT

Abstract: This study considers the unconditional smooth Rényi entropy, the smooth conditional Rényi entropy proposed by Kuzuoka [IEEE Trans. Inf. Theory, vol. 66, no. 3, pp. 1674–1690, 2020], and a new quantity which we term the conditional smooth Rényi entropy. In particular, we examine asymptotic expansions of these entropies when the underlying source with its side-information is stationary and memoryless. Using these smooth Rényi entropies, we establish one-shot coding theorems for several information-theoretic problems: Campbell's source coding, guessing problems, and task encoding problems, all allowing errors. In each problem, we consider two error formalisms: the average and maximum error criteria, where the averaging and maximization are taken with respect to the side-information of the source. Applying our asymptotic expansions to the derived one-shot coding theorems, we obtain various asymptotic fundamental limits for these problems when their error probabilities are allowed to be non-vanishing. We show that, in non-degenerate settings, the first-order fundamental limits differ under the average and maximum error criteria. This contrasts with a different but related setting considered by the present authors (variable-length conditional source coding allowing errors), in which the first-order terms are identical but the second-order terms differ under these two criteria.
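For orientation, here is a minimal sketch of the central quantities, stated in the standard conventions; the paper's exact definitions, in particular its conditional variants, may differ in details. For a distribution $P$ on a finite alphabet $\mathcal{X}$ and order $\alpha \in (0,1) \cup (1,\infty)$, the Rényi entropy is
\[
  H_\alpha(P) = \frac{1}{1-\alpha} \log \sum_{x \in \mathcal{X}} P(x)^\alpha ,
\]
and, in the Renner–Wolf convention, the $\varepsilon$-smooth Rényi entropy optimizes this over distributions within distance $\varepsilon$ of $P$:
\[
  H_\alpha^\varepsilon(P) =
  \begin{cases}
    \displaystyle \max_{Q \,:\, d(P,Q) \le \varepsilon} H_\alpha(Q), & \alpha > 1, \\[1ex]
    \displaystyle \min_{Q \,:\, d(P,Q) \le \varepsilon} H_\alpha(Q), & \alpha < 1,
  \end{cases}
\]
where $d$ is a suitable distance (e.g. total variation). The connection to Campbell's source coding problem is classical: for a prefix code with length function $\ell$ and cost parameter $t > 0$, Campbell (1965) showed that the optimal exponential cost is pinned down by a Rényi entropy,
\[
  H_\alpha(P) \le \min_{\ell} \frac{1}{t} \log_2 \mathbb{E}\!\left[ 2^{t\,\ell(X)} \right] < H_\alpha(P) + 1,
  \qquad \alpha = \frac{1}{1+t},
\]
which is the non-smooth, error-free analogue of the one-shot, error-allowing bounds established in the paper.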

Citations (3)
