Mapping Entropies: A Unified Framework
- Mapping entropies are a unified formalism that employs formal group theory to map composition laws and inequalities across classical and generalized entropy measures.
- They systematically recover and connect classical (Boltzmann-Gibbs, Rényi) and nonclassical (Tsallis, Sharma-Mittal) entropies through specific generator function choices and limits.
- The framework provides practical insights into composability, extensivity, and phase-space scaling, with applications in statistical physics, quantum channels, and dynamical systems.
Mapping entropies unify the diverse families of entropy functionals and their corresponding operations across probability distributions, dynamical systems, and information-theoretic frameworks by mapping the composition laws and associated inequalities into a general and composable theoretical architecture. This concept is central both to the algebraic classification of entropy measures (e.g., via formal group theory and composability axioms) and to the computational and geometric mapping of entropic regions in information theory. Mapping entropies thereby provide a rigorous bridge connecting group-theoretic, geometric, and probabilistic structures, with ramifications in classical, quantum, and dynamical settings.
1. Formal Group-Theoretical Foundation and Z-Entropies
A structurally complete synthesis of entropy functionals is attained by mapping additivity (the fourth Shannon-Khinchin axiom) into the composability property, which generalizes ordinary additivity by requiring that the entropy of a composite system $A \times B$ depend only on the entropies of $A$ and $B$, via a binary "group law." In the $Z$-entropy formalism (Tempesta, 2015), for any strictly increasing, continuous function $G$ admitting an inverse and any real entropic parameter $\alpha > 0$, $\alpha \neq 1$, the $Z$-entropy of a probability law $p = (p_1, \ldots, p_W)$ is

$$Z_{G,\alpha}(p) = \frac{\ln_G\left(\sum_{i=1}^{W} p_i^{\alpha}\right)}{1-\alpha}, \qquad \ln_G(x) := G(\ln x).$$
This construction is universal: by appropriate choices and limits of $G$ and $\alpha$, it recovers the Boltzmann-Gibbs, Rényi, Tsallis, Sharma-Mittal, Kaniadakis, and Borges–Roditi entropies, among others.
The key to composability is the existence of a formal group law

$$\Phi(x, y) = G\left(G^{-1}(x) + G^{-1}(y)\right),$$

making the composite entropy of two statistically independent systems $Z(A \times B) = \Phi(Z(A), Z(B))$ (up to a rescaling of the arguments by $1-\alpha$).
The requirements of composability (commutativity, associativity, and the existence of an identity element) reduce precisely to the axioms of a commutative one-dimensional formal group. This underlying structure determines many of the statistical properties of the entropies and explains their compatibility with information-theoretic and statistical physics applications.
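To make the construction concrete, the following minimal Python sketch (not taken from the cited papers; the helper names and the explicit $1-\alpha$ rescaling inside the composition law are our own conventions) evaluates $Z_{G,\alpha}$ for two generators and verifies the group-law composition on independent systems:

```python
import numpy as np

# Minimal sketch of the Z-entropy construction; names are illustrative.
def z_entropy(p, G, alpha):
    """Z_{G,alpha}(p) = ln_G(sum_i p_i^alpha) / (1 - alpha), ln_G(x) = G(ln x)."""
    return G(np.log(np.sum(p ** alpha))) / (1.0 - alpha)

# Additive generator G(t) = t reproduces the Renyi entropy.
renyi = lambda p, a: np.log(np.sum(p ** a)) / (1.0 - a)

# Generator G(t) = e^t - 1 gives ln_G(x) = x - 1, i.e. the Tsallis entropy,
# whose induced composition law is Phi(x, y) = x + y + (1 - q) x y.
q = 0.7
lam = 1.0 - q
G, G_inv = np.expm1, np.log1p
Phi = lambda x, y: G(G_inv(lam * x) + G_inv(lam * y)) / lam

pA = np.array([0.5, 0.3, 0.2])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()        # joint law of two independent systems

print(np.isclose(z_entropy(pA, lambda t: t, 0.5), renyi(pA, 0.5)))   # True
print(np.isclose(z_entropy(pAB, G, q),
                 Phi(z_entropy(pA, G, q), z_entropy(pB, G, q))))     # True
```

The second check confirms that the Tsallis composition law is exactly the addition law conjugated by the generator $G$.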
2. Mapping Entropies Across Families: Examples and Recoveries
All major classical and nonclassical entropy families arise as specific group laws or limiting cases of the $Z$-entropy formalism:
| Entropy Type | Generator or Group Law | Expression for $Z_{G,\alpha}$ |
|---|---|---|
| Boltzmann-Gibbs | $G(t) = t$ (additive) | $-\sum_i p_i \ln p_i$ (limit $\alpha \to 1$) |
| Rényi | $G(t) = t$ (additive) | $\frac{\ln \sum_i p_i^{\alpha}}{1-\alpha}$ |
| Tsallis | limit $\beta \to \alpha$ of Sharma-Mittal | $\frac{1 - \sum_i p_i^{\alpha}}{\alpha - 1}$ |
| Sharma-Mittal | multiplicative group law | $\frac{\left(\sum_i p_i^{\alpha}\right)^{\frac{1-\beta}{1-\alpha}} - 1}{1-\beta}$ |
The same construction applies, by parameter deformation, to the Abel formal group, producing three-parameter families and interpolating between all the previous cases. Extensions to quantum states are immediate, with the probabilities $p_i$ replaced by the eigenvalues of a density matrix $\rho$, together with the corresponding trace-form generalization.
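The limiting relations in the table are straightforward to verify numerically; the sketch below (function names and tolerances are ours) confirms that Sharma-Mittal degenerates to Tsallis as $\beta \to \alpha$ and to Rényi as $\beta \to 1$:

```python
import numpy as np

# Sketch: the two-parameter Sharma-Mittal family and its limiting cases.
def sharma_mittal(p, alpha, beta):
    return (np.sum(p ** alpha) ** ((1 - beta) / (1 - alpha)) - 1) / (1 - beta)

def tsallis(p, q):
    return (1 - np.sum(p ** q)) / (q - 1)

def renyi(p, alpha):
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

p, alpha = np.array([0.5, 0.25, 0.25]), 0.8
print(np.isclose(sharma_mittal(p, alpha, alpha + 1e-9), tsallis(p, alpha)))  # beta -> alpha
print(np.isclose(sharma_mittal(p, alpha, 1 + 1e-9), renyi(p, alpha)))        # beta -> 1
```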
3. Composability and Its Operational Consequences
Composability implies that the entropy function can be built recursively over composite or statistically independent systems, using the underlying group law: $S(A \times B) = \Phi(S(A), S(B))$. This is stronger than mere additivity; it allows for consistent entropy assignment in systems where additivity is replaced by generalized extensivity (superadditivity or subadditivity, depending on the entropy family), thus capturing non-extensive statistical behavior and ensuring that entropy remains a well-behaved measure under independent composition.
Further, composability can be characterized as the generalization of the fourth Shannon-Khinchin axiom, crucial for the sound development of statistical thermodynamics and generalized information theory in non-additive regimes.
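As an illustration of this recursive structure, under the Tsallis group law $\Phi(x, y) = x + y + (1-q)xy$ the entropy of $N$ independent, identical subsystems follows by iterating $\Phi$, and associativity yields a closed form (a minimal sketch under these assumptions; helper names are ours):

```python
from functools import reduce

q = 0.7
Phi = lambda x, y: x + y + (1 - q) * x * y    # Tsallis group law

def compose_n(s, n):
    """Entropy of n independent copies, each of entropy s, via repeated Phi."""
    return reduce(Phi, [s] * n)

s, n = 0.42, 8
closed_form = ((1 + (1 - q) * s) ** n - 1) / (1 - q)
print(abs(compose_n(s, n) - closed_form) < 1e-12)   # True
```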
4. Parametric Mapping and Continuity of Entropy Families
The entropy $Z_{G,\alpha}$ depends smoothly on both $\alpha$ and all parameters entering $G$, allowing a continuous mapping (in parameter space) between entropy types. This establishes a "manifold of entropies" where, for example:
- Moving $\alpha \to 1$ retrieves classical additive entropy, whereas varying the deformation parameter of $G$ traverses the Tsallis microcanonical regime.
- The universal Lazard formal group, initial in the category of one-dimensional commutative formal groups, yields a generating function encoding all possible group-like entropy composition laws; all known cases are specializations.
The continuous mapping also provides a framework to analyze extensivity and phase-space scaling, such as the law

$$W(N) \sim e^{G^{-1}(\lambda N)}$$

for the asymptotic phase-space growth $W(N)$ necessary to maintain extensivity for large particle number $N$ (with $\lambda$ a positive constant).
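For the Tsallis case ($G(t) = e^t - 1$, $G^{-1}(s) = \ln(1+s)$), this law predicts power-law phase-space growth $W(N) \sim N^{1/(1-q)}$, in contrast to the exponential growth of the Boltzmann-Gibbs regime. A quick numerical check (normalization is ours):

```python
# Which phase-space growth W(N) makes the Tsallis entropy extensive (q = 1/2)?
# On the uniform distribution over W states, S_q = (W**(1-q) - 1)/(1-q).
q = 0.5
S_q = lambda W: (W ** (1 - q) - 1) / (1 - q)

for N in [4, 16, 64, 256]:
    print(N, S_q(float(N ** 2)) / N, S_q(2.0 ** N) / N)
# Column 2 (W = N**2 ~ N**(1/(1-q))): S_q/N -> 2, i.e. extensive.
# Column 3 (W = 2**N, Boltzmann-Gibbs growth): S_q/N diverges.
```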
5. Mapping Entropy Regions and Geometric Information-Theoretic Structures
Beyond individual entropy functionals, the geometry of the set of possible entropic vectors (the "entropic region") for $n$ random variables is a central object in information theory. As discussed in (Liu et al., 2015), the closure of the set of entropic vectors forms a convex cone but is not polyhedral for $n \geq 4$. Mapping entropies in this context involves:
- Abstract algebraic enumeration of “supports” specifying possible probability distributions
- Geometric (information-geometric) parametrization of the manifold of probability distributions, using $m$- (mixture) and $e$- (exponential) coordinates, and interpreting faces of the entropic region as $e$-autoparallel submanifolds
- Numerical optimization over non-isomorphic supports, yielding improved inner bounds on entropy regions
- Analytical mapping of facets (e.g., Shannon-type faces as affine subspaces in $e$-coordinates) and identification of special supports attaining extremal entropy properties (e.g., maximal Ingleton violation)
A notable insight is the mapping of all Shannon-type equality faces of the entropy cone to $e$-autoparallel sets within the information geometry of probability distributions; the sketch below illustrates the entropic-vector computation directly.
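The basic objects here can be computed directly from a joint distribution. The following sketch (the random pmf and all helper names are ours) extracts the 15-dimensional entropic vector of four random variables and evaluates the Ingleton expression, whose nonnegativity characterizes the linearly representable inner region while general entropic vectors may violate it:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
p = rng.random((2, 2, 2, 2))
p /= p.sum()                         # joint pmf of X1..X4 on binary alphabets

def H(subset):
    """Shannon entropy (nats) of the marginal on the given axes."""
    other = tuple(i for i in range(4) if i not in subset)
    m = p.sum(axis=other).ravel()
    m = m[m > 0]
    return -np.sum(m * np.log(m))

# The entropic vector: one entry per nonempty subset of {X1, X2, X3, X4}.
h = {s: H(s) for r in range(1, 5) for s in itertools.combinations(range(4), r)}

# Ingleton expression I(X1;X2|X3) + I(X1;X2|X4) + I(X3;X4) - I(X1;X2).
I12   = H((0,)) + H((1,)) - H((0, 1))
I12_3 = H((0, 2)) + H((1, 2)) - H((0, 1, 2)) - H((2,))
I12_4 = H((0, 3)) + H((1, 3)) - H((0, 1, 3)) - H((3,))
I34   = H((2,)) + H((3,)) - H((2, 3))
print(len(h), I12_3 + I12_4 + I34 - I12)
```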
6. Mapping Pseudo-Additive Entropies and Chain-Rule Mechanisms
The mapping of pseudo-additive entropies (such as the Tsallis family, which satisfies a $q$-additive chain rule) into additive forms, as in Daróczy's mapping theorem (Jizba et al., 2017), is crucial for establishing the operational equivalence between nonadditive and additive entropies:
- The Daróczy map transforms the additive chain rule into a $q$-additive one (a numerical sketch follows this list)
- Uniqueness results depend on the conditional entropy prescription (e.g., linear-escort averaging yields Tsallis entropy as the unique solution)
- Mapping the joint distribution via escort distributions clarifies the failure of De Finetti–Kolmogorov factorization except in special cases, guiding the conditions under which pseudo-additive entropies maintain their desired chain-rule properties
- This mapping process is fundamental for classifying and transferring thermodynamical properties across different entropy families, connecting, for example, superadditivity and extensivity classes (Landsberg’s classification)
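A minimal sketch of the Tsallis-to-Rényi direction of this mapping (the transform $\phi(s) = \ln(1 + (1-q)s)/(1-q)$ is the standard Daróczy-type map; the test distributions are ours):

```python
import numpy as np

q = 0.7
tsallis = lambda p: (1 - np.sum(p ** q)) / (q - 1)
daroczy = lambda s: np.log1p((1 - q) * s) / (1 - q)   # maps Tsallis -> Renyi

pA = np.array([0.5, 0.3, 0.2])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()        # independent joint distribution

# Tsallis is q-additive ...
print(np.isclose(tsallis(pAB),
                 tsallis(pA) + tsallis(pB) + (1 - q) * tsallis(pA) * tsallis(pB)))
# ... while its Daroczy image is strictly additive (it equals Renyi entropy):
print(np.isclose(daroczy(tsallis(pAB)),
                 daroczy(tsallis(pA)) + daroczy(tsallis(pB))))
```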
7. Quantum, Dynamical, and Geometric Extensions
The mapping-entropies paradigm extends seamlessly to quantum channels and dynamical systems:
- The unified-entropy map entropy for quantum channels (Rastegin, 2011) generalizes the trace-form entropies and satisfies a composability law akin to the classical case, with operational consequences such as continuity, sub-additivity, and entropy-exchange inequalities (a minimal sketch follows this list)
- Mapping entropies via group laws underlies the algebraic properties of entanglement and information-theoretic inequalities in quantum information theory
- In dynamical systems, mapping entropic invariants (such as topological entropy, directional entropy, or algebraic entropy of mappings) relies on symbolic dynamics or algebraic structures; these, too, often admit group-like or compositional properties that can be mapped back to underlying entropy composition laws, embedding dynamical complexity into the mapping-entropy framework
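As a minimal quantum illustration, assuming the usual convention that the map entropy of a channel is an entropy of its normalized Choi–Jamiołkowski state, the sketch below evaluates unified $(q,s)$-entropies for a qubit depolarizing channel (the example channel and parameter values are ours, not from Rastegin, 2011):

```python
import numpy as np

def unified_entropy(eigvals, q, s):
    """Unified (q,s)-entropy: ((Tr rho^q)^s - 1) / ((1-q) s), q != 1, s != 0."""
    return (np.sum(eigvals ** q) ** s - 1) / ((1 - q) * s)

lam = 0.6   # depolarizing channel: rho -> lam*rho + (1 - lam)*I/2
# Normalized Choi state lam*|phi+><phi+| + (1 - lam)*I/4 has eigenvalues:
ev = np.array([lam + (1 - lam) / 4] + 3 * [(1 - lam) / 4])

print(unified_entropy(ev, q=2.0, s=1.0))   # Tsallis-2 map entropy
print(-np.sum(ev * np.log(ev)))            # von Neumann limit (q, s -> 1)
```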
8. Mathematical and Practical Implications
The mapping of entropies via formal group theory and geometric constructions provides:
- A unified language for defining, categorizing, and comparing entropy functionals across contexts
- Rigorous criteria for composability, extensivity, and chain-rule compatibility
- Pathways for designing novel entropic measures with prescribed properties, via selection or deformation of formal group laws
- Insight into the algebraic and geometric constraints governing entropy regions, inequalities, and their operational meaning in information theory and statistical physics
Mapping entropies thus constitute a fundamental and unifying concept, establishing the formal and operational backbone of entropy theory and its generalizations, and allowing continuous deformation, composition, and application across the full spectrum of classical, quantum, and non-extensive statistical paradigms (Tempesta, 2015; Liu et al., 2015; Jizba et al., 2017).