
On how generalised entropies without parameters impact information optimisation processes (2103.11759v1)

Published 19 Jan 2021 in cs.IT, cond-mat.stat-mech, and math.IT

Abstract: As an application of generalised statistical mechanics, we study a possible route toward a consistent generalised information theory in terms of a family of non-extensive, non-parametric entropies $H^{\pm}_D(P)$. Unlike other proposals based on non-extensive entropies with a parameter dependence, our scheme is asymptotically equivalent to the one formulated by Shannon, while it differs in regions where the density of states is reasonably small, which leads to information distributions constrained to their background. To this aim, two basic concepts are discussed. First, we prove two effective coding theorems for the entropies $H^{\pm}_D(P)$. Then we calculate the channel capacity of a binary symmetric channel (BSC) and a binary erasure channel (BEC) in terms of these entropies. We find that processes such as data compression and channel capacity maximisation can be improved in regions with a low density of states, whereas for high densities our results coincide with Shannon's formulation.
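The abstract compares the generalised entropies against Shannon's formulation for the BSC and BEC. The entropies $H^{\pm}_D(P)$ themselves are not defined in the abstract, so the sketch below implements only the classical Shannon baseline the paper reports agreement with at high state densities: the BSC capacity $C = 1 - H_b(p)$ (with $H_b$ the binary entropy) and the BEC capacity $C = 1 - \varepsilon$. Function names are illustrative, not from the paper.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon binary entropy H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bec_capacity(eps: float) -> float:
    """Capacity of a binary erasure channel with erasure probability eps."""
    return 1.0 - eps
```

For example, `bsc_capacity(0.5)` is 0 (a fully noisy channel carries no information), while `bec_capacity(0.25)` is 0.75. The paper's claim is that capacities computed from $H^{\pm}_D$ deviate from these values only where the density of states is small.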

