On how generalised entropies without parameters impact information optimisation processes (2103.11759v1)
Abstract: As an application of generalised statistical mechanics, we study a possible route toward a consistent generalised information theory in terms of a family of non-extensive, non-parametric entropies $H^\pm_D(P)$. Unlike other proposals based on non-extensive entropies with a parameter dependence, our scheme is asymptotically equivalent to Shannon's formulation, while it differs in regions where the density of states is reasonably small, leading to information distributions constrained to their background. To this aim, two basic concepts are discussed. First, we prove two effective coding theorems for the entropies $H^\pm_D(P)$. Then we calculate the channel capacity of a binary symmetric channel (BSC) and a binary erasure channel (BEC) in terms of these entropies. We find that processes such as data compression and channel capacity maximisation can be improved in regions with a low density of states, whereas for high densities our results coincide with Shannon's formulation.
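The generalised entropies $H^\pm_D(P)$ are not defined in the abstract, but the Shannon baseline against which the paper compares is standard: the BSC with crossover probability $p$ has capacity $1 - H(p)$, where $H$ is the binary entropy, and the BEC with erasure probability $\varepsilon$ has capacity $1 - \varepsilon$. A minimal sketch of these reference capacities (function names are illustrative, not from the paper):

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon binary entropy H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bec_capacity(eps: float) -> float:
    """Capacity of a binary erasure channel with erasure probability eps."""
    return 1.0 - eps

# A noiseless BSC has capacity 1 bit; at p = 0.5 the channel carries nothing.
print(bsc_capacity(0.0), bsc_capacity(0.5), bec_capacity(0.25))
```

The paper's claim is that capacities computed from $H^\pm_D(P)$ reduce to these expressions at high density of states and exceed them where the density of states is low.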