Geometric Decomposition of Information Flow
- Geometric decomposition of information flow is a framework that employs geometric and convex tools to partition mutual information and dependency structures into interpretable, orthogonal components.
- It leverages information geometry and poset-based methods to yield rigorous invariants, enabling the precise quantification of redundancy, synergy, and thermodynamic costs.
- The approach integrates network theory and measure theory to analyze feedback and directional flows, providing actionable insights for both classical and quantum systems.
Geometric decomposition of information flow refers to a diverse set of frameworks in which information-theoretic measures—such as mutual information, information flow between subsystems, or fine-grained dependency structures—are decomposed using geometric or structural tools from information geometry, convex geometry, network theory, and measure theory. These methods yield rigorous invariants, interpretable decompositions, and structure–function relationships in classical, quantum, and complex systems, enabling precise quantification of different modes of information transfer, redundancy, synergy, thermodynamic cost, and feedback.
1. Foundations: Information Geometry and Decomposition Principles
Central to geometric decompositions is the identification of the probability simplex or parameter manifold as a differentiable, often Riemannian, manifold. Canonical structures include the Fisher information metric, divergence functions (KL, Wasserstein), and dual affine connections. The decomposition proceeds by projecting or splitting flows, divergences, or measure-theoretic objects into geometrically meaningful pieces:
- For families of distributions on event lattices or posets, dually flat geometry guarantees orthogonality of coordinate subspaces, so the Kullback–Leibler divergence admits a unique "Pythagorean" decomposition along chosen submanifolds (Sugiyama et al., 2016).
- In network and Markov-process settings, flows (currents) and conjugate thermodynamic forces are split geometrically into cyclic (housekeeping), conservative (excess), and acyclic components by orthogonal projections in an appropriate tangent or polytope space (Maekawa et al., 26 Sep 2025, Ito et al., 28 Dec 2025, Homs-Dones et al., 14 Jun 2025).
- For information quantities such as mutual or co-information, geometric decompositions act on the underlying information space: atoms correspond to open simplices equipped with signed entropy measures, yielding a sum-of-parts over the lattice of event distinctions (Down et al., 2023, Bertschinger et al., 2012).
- In statistical manifolds of finite events, the Amari–Nagaoka dualistic geometry underpins the decomposition of information-theoretic quantities along any poset ordering, enabling both hierarchical and customized decompositions (Sugiyama et al., 2016).
The geometric approach generalizes classical inclusion-exclusion formulas and the Möbius inversion of information measures, producing atomic or orthogonal decompositions that encode the fine structure of statistical dependency.
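As a concrete check of the Pythagorean relation on a dually flat family (with the distribution and the comparison point invented for illustration), one can verify numerically that the KL divergence from a joint distribution to any product distribution splits through the m-projection onto the independence submanifold:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, float).ravel(), np.asarray(q, float).ravel()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# A joint distribution p(x, y) on a 2x2 event space (numbers invented).
p = np.array([[0.3, 0.2],
              [0.1, 0.4]])

# m-projection of p onto the independence submanifold: product of marginals.
p_star = np.outer(p.sum(axis=1), p.sum(axis=0))

# Any other point q on the same submanifold (a product distribution).
q = np.outer([0.6, 0.4], [0.5, 0.5])

# Pythagorean relation:  KL(p || q) = KL(p || p*) + KL(p* || q)
lhs = kl(p, q)
rhs = kl(p, p_star) + kl(p_star, q)
print(lhs, rhs)  # the two sides agree up to floating-point error
```

The identity is exact here because the independence family is dually flat and the product of marginals is the m-projection of p onto it.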
2. Geometric Decomposition in Information Thermodynamics
Information flow between subsystems and its physical consequences have motivated the explicit geometric splitting of flow rates in autonomous stochastic systems:
- Markov jump systems. For a bipartite Markov process, the information flow rate between the two subsystems is split into two geometric terms (Maekawa et al., 26 Sep 2025):
- Housekeeping flow is generated by probability currents along cycles in the network and is associated with maintaining steady-state correlations.
- Excess flow occurs along conservative (gradient) directions, representing genuine changes in mutual information.
- The decomposition is defined via projections of thermodynamic forces onto the gradient image and kernel, resulting in splitting of entropy production and second-law–like inequalities per component.
- Overdamped Langevin systems. The geometric split generalizes cleanly to Langevin dynamics (Ito et al., 28 Dec 2025):
- The excess component is identified by the potential part of the force field, corresponding precisely to paths of optimal transport (i.e., minimizing the 2-Wasserstein distance) between marginal distributions.
- The housekeeping component is the divergence-free, rotational part, and its energetic cost can be analyzed using Koopman-mode decompositions and spectral properties.
- This geometric decomposition yields thermodynamic uncertainty relations and subsystem-specific speed limits tightly bounded by the excess flow. The presence of "excess" and "housekeeping" demons is determined by the signs of the partial entropy dissipation.
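A minimal discrete sketch of this splitting (graph, incidence convention, and force vector all invented for illustration): projecting an edge-wise force onto the image of the discrete gradient yields the conservative (excess) part, and the orthogonal residual lies in the cycle space (housekeeping part).

```python
import numpy as np

# A small directed graph: a 3-cycle plus a chord (edges invented).
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
n_nodes = 3

# Incidence matrix B: +1 where an edge leaves a node, -1 where it enters.
B = np.zeros((n_nodes, len(edges)))
for e, (u, v) in enumerate(edges):
    B[u, e], B[v, e] = 1.0, -1.0

# An edge-wise thermodynamic force vector (values invented).
f = np.array([1.0, 2.0, 0.5, -1.0])

# Excess (conservative) part: least-squares projection onto im(B^T),
# i.e. onto gradients of node potentials.
phi, *_ = np.linalg.lstsq(B.T, f, rcond=None)
f_excess = B.T @ phi
# Housekeeping part: the residual, which lies in ker(B), the cycle
# space of the graph, and is therefore divergence-free.
f_hk = f - f_excess

print(np.allclose(B @ f_hk, 0.0))         # cyclic part is divergence-free
print(np.allclose(f_hk @ f_excess, 0.0))  # the split is orthogonal
```

The same projection underlies the splitting of entropy production: the two components are orthogonal, so squared norms add.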
The table below summarizes core terms in these frameworks:
| Component | Definition/Origin | Interpretation |
|---|---|---|
| Housekeeping | Cyclic current/rotational force | Maintains steady-state, no MI change |
| Excess | Gradient current/conservative force | Drives MI change, cost = OT distance |
| Circular (Network) | Cycle or feedback subnetwork | Information trapped in system cycles |
| Acyclic | Gradient/throughflow | Directional, net information transfer |
3. Geometric Decompositions in Network and System Flows
- Circular–Directional Flow Decomposition (CDFD): On a weighted directed network, any flow can be uniquely decomposed into a divergence-free (circular) component and an acyclic (directional) component (Homs-Dones et al., 14 Jun 2025). The circularity index quantifies the fraction of total flow involved in cycles.
The decomposition space is a convex polytope-complex, with two canonical benchmark flows:
- Maximum circularity (found via minimum-cost flow optimization).
- Balanced flow forwarding (BFF): a unique, locally computable split that proportionally distributes circular flow.
These structures enable analysis of feedback, redundancy, directionality, and flow inefficiency in data, infrastructure, or biological systems.
- Network filtering via information geometry: In modular networks, identifying high- and low-information nodes using local curvature (shape operator derived from Fisher information) divides the network into complementary subgraphs (HI and LO) that separately capture boundary vs. consensus information flow (Levada, 2024).
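As a sketch of the circular–acyclic split, a greedy cycle-cancelling heuristic (not the minimum-cost optimization or the BFF construction of the cited work; graph and weights invented) extracts a circular component by repeatedly cancelling directed cycles until an acyclic residual remains:

```python
def find_cycle(flow):
    """Return a directed cycle (list of edges) in the positive-flow graph, or None."""
    adj = {}
    for (u, v), w in flow.items():
        if w > 1e-12:
            adj.setdefault(u, []).append(v)
    for start in adj:
        stack, path, seen = [(start, iter(adj.get(start, [])))], [start], {start}
        while stack:
            node, it = stack[-1]
            nxt = next(it, None)
            if nxt is None:
                stack.pop(); path.pop(); seen.discard(node)
            elif nxt in seen:
                cyc = path[path.index(nxt):] + [nxt]
                return list(zip(cyc, cyc[1:]))
            else:
                seen.add(nxt); path.append(nxt)
                stack.append((nxt, iter(adj.get(nxt, []))))
    return None

def cdfd(flow):
    """Split an edge flow into circular (cyclic) and acyclic (directional) parts."""
    acyclic = dict(flow)
    circular = {e: 0.0 for e in flow}
    while (cyc := find_cycle(acyclic)) is not None:
        w = min(acyclic[e] for e in cyc)   # cancel the bottleneck of the cycle
        for e in cyc:
            acyclic[e] -= w
            circular[e] += w
    return circular, acyclic

flow = {("a", "b"): 3.0, ("b", "c"): 2.0, ("c", "a"): 1.0}
circ, acyc = cdfd(flow)
print(sum(circ.values()) / sum(flow.values()))  # circularity index: 0.5
```

Greedy cancellation need not reach the maximum-circularity benchmark in general, but it always terminates (each pass zeroes at least one edge) and leaves a genuinely acyclic residual.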
4. Atomic and Poset-Based Decomposition: Logarithmic and Structured Spaces
- Logarithmic Decomposition (LD): The information content of discrete random variables is reified as signed measures on the space of open simplices (logarithmic atoms) (Down et al., 2023):
- Shannon and mutual information correspond to signed measures over unions/intersections of content sets.
- Any "logarithmically decomposable" quantity (including Gács–Körner common information, total correlation, O-information) is expressed as a sum over precise geometric atoms.
- The Möbius inversion on the lattice provides canonical atomic weights with alternating sign structure, refining classic I-diagram regions and connecting coherently to Yeung's I-measure.
- Information geometry on posets: Orthogonal decomposition of KL divergences is governed by the dually-flat geometry of the statistical manifold over a finite poset (Sugiyama et al., 2016). For any chain of event subsets (or powerset lattice), total divergence is split additively over orthogonal subspaces, providing a rigorous foundation for (partial) hierarchical decompositions of entropy and information in high-order systems.
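The two-variable case of the Möbius/inclusion-exclusion picture can be made concrete (distribution invented): the signed atoms of the I-diagram are obtained by alternating-sign combinations of subset entropies, and they tile the joint entropy exactly.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution of two correlated bits (numbers invented).
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
Hxy = entropy(pxy)
Hx = entropy(pxy.sum(axis=1))
Hy = entropy(pxy.sum(axis=0))

# Signed atoms of the two-variable I-diagram via inclusion-exclusion
# (the two-variable case of Mobius inversion on the subset lattice):
atoms = {
    "H(X|Y)": Hxy - Hy,
    "H(Y|X)": Hxy - Hx,
    "I(X;Y)": Hx + Hy - Hxy,
}
print(atoms)
print(sum(atoms.values()), Hxy)  # the atoms sum to the joint entropy
```

For three or more variables the same inversion produces co-information atoms, which can be negative; this is the sign structure the logarithmic decomposition refines.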
5. Convex Geometry and Partial Information Decomposition
- In the geometric partial information decomposition (PID) approach, the structure of shared, unique, and synergistic information is mapped to intersection and convex-hull overlap in the simplex of posteriors (Bertschinger et al., 2012):
- Shared (redundant) information corresponds to the least-informative point common to the convex hulls of all posterior beliefs given different predictors.
- Unique information is the excess of each convex hull beyond the intersection; synergy is the leftover that is available only in the joint support.
- PID-lattice structure encodes the partial order of information subtasks and is computed algorithmically via convex projections and Möbius inversion.
- This approach connects geometric overlap to operational semantics (shared knowledge, game theory), though computational and axiomatic challenges remain for higher-order cases.
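For intuition, a minimal PID computation using the Williams–Beer I_min redundancy (a related but distinct measure, not the convex-geometric construction above; example invented) recovers the textbook XOR result of pure synergy:

```python
import numpy as np

# Joint distribution for Y = X1 XOR X2 with uniform input bits: p[x1, x2, y].
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1 ^ x2] = 0.25
py = p.sum(axis=(0, 1))

def specific_info(p, src_axis, y):
    """I(Y=y ; X_src): information one source provides about outcome y (bits)."""
    pxy = p.sum(axis=1 - src_axis)   # marginal p(x_src, y)
    px = pxy.sum(axis=1)
    val = 0.0
    for x in range(2):
        if pxy[x, y] > 0:
            val += (pxy[x, y] / py[y]) * np.log2((pxy[x, y] / px[x]) / py[y])
    return val

# Williams-Beer redundancy: expected minimum specific information over sources.
redundancy = sum(py[y] * min(specific_info(p, 0, y), specific_info(p, 1, y))
                 for y in range(2))

# Single-source and joint mutual informations.
I1 = sum(py[y] * specific_info(p, 0, y) for y in range(2))
I2 = sum(py[y] * specific_info(p, 1, y) for y in range(2))
pj = p.reshape(4, 2)
Ij = sum(pj[x, y] * np.log2(pj[x, y] / (pj[x].sum() * py[y]))
         for x in range(4) for y in range(2) if pj[x, y] > 0)

# Mobius inversion on the two-source redundancy lattice.
unique1, unique2 = I1 - redundancy, I2 - redundancy
synergy = Ij - redundancy - unique1 - unique2
print(redundancy, unique1, unique2, synergy)  # XOR: 0, 0, 0, 1 bit
```

Neither input alone carries any information about the XOR output, so everything lands in the synergy atom; the geometric PID assigns the same decomposition to this example.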
6. Information Flow Decomposition in Time and Computation
- The shared mass exclusion framework enables geometric, atomic decomposition of information flow from past to future in time-series data (Varley, 2022):
- Local redundancy atoms, defined via probability mass exclusions, are assigned to nodes in the product PID lattice for multiple past and future variables.
- Möbius inversion on the product lattice yields mutually exclusive atomic flows, reconstructing pointwise and global mutual information as the sum over these interaction atoms.
- This decomposition is practical and fully localizable, capturing temporally extended, synergistic, or redundant structure in neural, dynamical, and computational systems.
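The pointwise quantities underlying such decompositions are easy to exhibit (toy distribution invented): local mutual information values, which may be negative for individual events, average under the joint to the global mutual information.

```python
import numpy as np

# Joint distribution p(past, future) for a toy two-state process.
p = np.array([[0.35, 0.15],
              [0.05, 0.45]])
p_past = p.sum(axis=1)
p_future = p.sum(axis=0)

# Pointwise (local) mutual information i(x; y) = log2 p(x,y) / (p(x) p(y)).
local = np.log2(p / np.outer(p_past, p_future))

# The global mutual information is the p-weighted average of the local values.
I_global = float(np.sum(p * local))
print(local)     # individual entries can be negative ("misinformative" events)
print(I_global)  # but the expectation is the (nonnegative) mutual information
```

The shared-mass-exclusion framework refines each of these pointwise values further into redundancy, unique, and synergy atoms on the product lattice.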
7. Physical and Quantum Geometric-Phase Decomposition
- In open quantum systems, the trace distance evolution between system and reference states is partitioned into "forward" (irreversible, Markovian) and "backward" (reversible, non-Markovian) flows (Wu et al., 2010), encoding loss and recovery of system information.
- The acquired geometric phase (Uhlmann–Tong generalization) is shown to depend analytically on the balance of forward/backward information fluxes. The phase is maximal in a purely Markovian regime and suppressed by non-Markovian backflow, enabling operational detection of memory effects as extremal points in the geometric phase landscape.
8. Geometric and Emergent Approximations in Complex Systems
- Information graph flows map high-dimensional quantum or statistical states onto sparse, low-dimensional geometric graphs (lattices) by evolving mutual-information adjacency matrices with nonlinear ODEs (Vanchurin, 2017):
- The process induces emergent metrics and Ricci-type flows, producing effective geometric field theories over the coarse-grained system.
- This approach geometrizes information structure, facilitating analytical and numerical study of metric emergence, sparsification, and topological transitions.
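Purely as an illustration of how evolving an adjacency matrix can sparsify a graph, here is a toy bistable flow; the dynamics below are invented for this sketch and are not the ODEs of the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "mutual-information adjacency": random symmetric weights in [0, 1].
n = 6
A = rng.uniform(0.0, 1.0, size=(n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)

# Illustrative bistable flow dA/dt = A (1 - A)(A - theta): entries below
# the threshold theta decay toward 0, entries above grow toward 1 --
# a crude stand-in for sparsifying adjacency dynamics.
theta, dt = 0.5, 0.1
for _ in range(2000):
    A += dt * A * (1.0 - A) * (A - theta)

print(np.round(A, 3))  # entries collapse to ~0 or ~1: a sparse binary graph
```

The point is only the mechanism: a nonlinear flow on the adjacency matrix drives it toward a sparse, effectively geometric graph on which coarse-grained structure can be read off.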
References
- Decompositions and thermodynamic splitting: (Maekawa et al., 26 Sep 2025, Ito et al., 28 Dec 2025, Homs-Dones et al., 14 Jun 2025)
- Logarithmic, atomic, and poset-based decompositions: (Down et al., 2023, Sugiyama et al., 2016)
- Partial information decomposition and convex geometry: (Bertschinger et al., 2012)
- Time-dependent atomic decomposition: (Varley, 2022)
- Network geometric decompositions: (Levada, 2024)
- Quantum and physical geometric-phase decomposition: (Wu et al., 2010)
- Emergent geometric flow in high-dimensional systems: (Vanchurin, 2017)
In summary, geometric decomposition of information flow encompasses a broad array of rigorous structural tools, ranging from orthogonal splitting of forces in nonequilibrium thermodynamics and atomic Möbius decompositions in information measure theory to convex-hull overlaps in PID and network- or metric-based separations. These frameworks partition information into interpretable flows or atoms, deliver measurable or computable invariants, and ground the analysis of feedback, redundancy, synergy, cost, and irreversibility in both classical and quantum complex systems.