
O-Information in Complex Systems

Updated 25 March 2026
  • O-information is a multivariate measure that quantifies the net balance between redundancy (overlapping information) and synergy (emergent patterns) in systems with three or more variables.
  • It is mathematically defined as the difference between total correlation and dual total correlation, allowing for efficient scaling and clear detection of higher-order dependencies.
  • Applications span neuroscience, network physiology, and machine learning, with extensions like local, dynamic, and structured O-information enhancing targeted analysis.

O-information is a multivariate information-theoretic functional that quantifies the net balance between redundancy and synergy in systems of three or more random variables. Unlike classical measures such as mutual information or total correlation that are sensitive chiefly to lower-order (pairwise or all-to-all) dependencies, O-information rigorously characterizes whether a system’s higher-order dependencies are dominated by overlapping (redundant) information or by emergent, jointly-encoded (synergistic) structure. The O-information is symmetric, scales efficiently with system size, and applies to both static and dynamical data. It is now a central tool for the analysis of high-order interactions across neuroscience, network physiology, complex systems, and machine learning.

1. Formal Definition and Mathematical Foundations

Let $X^n = (X_1, \dots, X_n)$ be an $n$-dimensional discrete or continuous random vector, where each $X_j$ is defined on a finite or continuous alphabet. The core formulation of O-information is as the difference of two classical multivariate information measures:

  • Total correlation (TC) (a.k.a. multi-information, $C$):

$$C(X^n) \equiv \sum_{j=1}^n H(X_j) - H(X^n)$$

  • Dual total correlation (DTC) (a.k.a. binding entropy, $B$):

$$B(X^n) \equiv H(X^n) - \sum_{j=1}^n H(X_j \mid X_{-j}^n)$$

where $X_{-j}^n$ denotes the vector with $X_j$ removed.

$$\boxed{\;\Omega(X^n) \equiv C(X^n) - B(X^n) = (n-2)\,H(X^n) + \sum_{j=1}^n \left[\,H(X_j) - H(X_{-j}^n)\,\right]\;}$$
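As an illustrative sketch (not code from the cited papers), $\Omega$ can be evaluated directly from this closed form by representing a discrete joint distribution as an $n$-dimensional NumPy array with one axis per variable; the helper names are ours:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (in bits) of a pmf given as an array of any shape."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def o_information(pmf):
    """Omega(X^n) = (n-2) H(X^n) + sum_j [H(X_j) - H(X_{-j})]."""
    pmf = np.asarray(pmf, dtype=float)
    n = pmf.ndim
    omega = (n - 2) * entropy_bits(pmf)
    for j in range(n):
        rest = tuple(k for k in range(n) if k != j)
        omega += entropy_bits(pmf.sum(axis=rest))  # marginal H(X_j)
        omega -= entropy_bits(pmf.sum(axis=j))     # H(X_{-j})
    return omega

# Three independent fair bits: no shared information at all, so Omega = 0.
p_indep = np.full((2, 2, 2), 1 / 8)
print(o_information(p_indep))  # -> 0.0
```

Marginalizing the pmf array over the complementary axes yields each entropy in the formula; the cost is one joint-entropy evaluation plus $2n$ marginal-entropy evaluations.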

Alternative formulations and key algebraic decompositions include:

  • For $n = 3$, $\Omega(X_1, X_2, X_3)$ reduces to interaction information:

$$\Omega(X_1, X_2, X_3) = I(X_1; X_2; X_3) = I(X_1; X_2) - I(X_1; X_2 \mid X_3)$$

  • For $n$ variables, O-information can be written as a sum of three-body interaction informations:

$$\Omega(X^n) = \sum_{k=2}^{n-1} I(X_k;\, X^{k-1};\, X_{k+1}^n)$$

for any permutation of indices.

These forms highlight that O-information measures departures from pure pairwise-decomposable structure—vanishing for systems composed solely of independent pairs or trees, and amplifying in the presence of nontrivial multipartite dependencies (Rosas et al., 2019, Varley, 12 Jan 2026).
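The three-body decomposition can be sanity-checked numerically. The sketch below (illustrative helper names; group entropies obtained by marginalizing a pmf array) verifies it on the 4-bit even-parity distribution, for which both sides equal $-2$ bits:

```python
import itertools
import numpy as np

def H(pmf, axes):
    """Entropy (bits) of the marginal over the variable indices in `axes`."""
    drop = tuple(k for k in range(pmf.ndim) if k not in axes)
    p = pmf.sum(axis=drop).ravel() if drop else pmf.ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def co_info(pmf, A, B, C):
    """Interaction information I(A;B;C) = I(A;B) - I(A;B|C) for disjoint groups."""
    return (H(pmf, A) + H(pmf, B) + H(pmf, C)
            - H(pmf, A + B) - H(pmf, A + C) - H(pmf, B + C)
            + H(pmf, A + B + C))

def o_information(pmf):
    n = pmf.ndim
    return (n - 2) * H(pmf, list(range(n))) + sum(
        H(pmf, [j]) - H(pmf, [k for k in range(n) if k != j]) for j in range(n))

# 4-bit even-parity distribution (uniform over bit strings with XOR = 0).
n = 4
pmf = np.zeros((2,) * n)
for bits in itertools.product((0, 1), repeat=n - 1):
    pmf[bits + (sum(bits) % 2,)] = 2.0 ** (1 - n)

lhs = o_information(pmf)
rhs = sum(co_info(pmf, [k], list(range(k)), list(range(k + 1, n)))
          for k in range(1, n - 1))  # 0-based version of k = 2 .. n-1
print(lhs, rhs)  # -> -2.0 -2.0
```

Each summand here is $-1$ bit: every pair of groups is marginally independent, but conditioning on the third group creates dependence, the hallmark of synergy.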

2. Operational Interpretation: Redundancy vs. Synergy

O-information provides a signed, scalar diagnostic of a system’s organization:

  • $\Omega(X^n) > 0$: Redundancy-dominated. Several variables encode overlapping information, and collective constraints (i.e., the same “bits” appearing repeatedly) dominate. This occurs in systems with strong copy-like or degenerate structure (e.g., all variables are identical).
  • $\Omega(X^n) < 0$: Synergy-dominated. Information is encoded only at the global, joint level; high-order, emergent patterns are present that are invisible in any subset. Archetypal examples include the $n$-bit XOR, where global structure is maximal but no pairwise marginals are informative.
  • $\Omega(X^n) = 0$: Balance, or exclusively pairwise dependencies (e.g., Gaussian Markov trees); redundancy and synergy exactly cancel (Rosas et al., 2019, Varley, 12 Jan 2026).

Sign and magnitude reflect the “order” of the dominant interdependencies, as formalized via the $\Delta^k$ measures (Varley, 12 Jan 2026). O-information ($O(X) = -\Delta^2(X)$) discriminates whether interactions of order strictly above or below two predominate.
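The two archetypes above can be checked concretely. In this sketch (illustrative code, not from the cited papers), three copies of one fair bit give $\Omega = +1$ bit, while the 3-bit XOR gives $\Omega = -1$ bit:

```python
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def o_information(pmf):
    pmf = np.asarray(pmf, dtype=float)
    n = pmf.ndim
    omega = (n - 2) * entropy_bits(pmf)
    for j in range(n):
        rest = tuple(k for k in range(n) if k != j)
        omega += entropy_bits(pmf.sum(axis=rest)) - entropy_bits(pmf.sum(axis=j))
    return omega

# Redundancy: X1 = X2 = X3 = one fair coin flip.
p_copy = np.zeros((2, 2, 2))
p_copy[0, 0, 0] = p_copy[1, 1, 1] = 0.5

# Synergy: uniform over even-parity triples, i.e. X3 = X1 XOR X2.
p_xor = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p_xor[x1, x2, x1 ^ x2] = 0.25

print(o_information(p_copy))  # -> 1.0  (redundancy-dominated)
print(o_information(p_xor))   # -> -1.0 (synergy-dominated)
```

In the XOR case every pair of variables is marginally independent, so the negative sign is driven entirely by the third-order constraint.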

3. Key Analytical Properties and Extensions

O-information exhibits several fundamental properties (Rosas et al., 2019, Varley, 12 Jan 2026):

| Property | Description |
| --- | --- |
| Symmetry | Invariant under permutations of the variables |
| Pairwise triviality | $\Omega(X_1, X_2) = 0$ for any joint distribution |
| Additivity | For independent subsystems, $\Omega(X, Y) = \Omega(X) + \Omega(Y)$ |
| Extremal values | $(2-n)\log m \leq \Omega(X^n) \leq (n-2)\log m$ for variables with $m$ states; the bounds are tight |
| Zero for pairwise-only systems | Holds for all tree-structured dependencies or independent pairs |
| Scaling and decomposability | Decomposes as a sum of three-body interaction informations; efficient for large $n$ |
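The tightness of the extremal bounds can be verified numerically. In this illustrative sketch (working in bits, with helper names of our choosing), $n$ identical copies of an $m$-state uniform variable attain the upper bound $(n-2)\log m$, and the $n$-bit even-parity distribution attains the lower bound $(2-n)\log 2$:

```python
import itertools
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def o_information(pmf):
    pmf = np.asarray(pmf, dtype=float)
    n = pmf.ndim
    omega = (n - 2) * entropy_bits(pmf)
    for j in range(n):
        rest = tuple(k for k in range(n) if k != j)
        omega += entropy_bits(pmf.sum(axis=rest)) - entropy_bits(pmf.sum(axis=j))
    return omega

n, m = 4, 3

# Upper bound: X1 = ... = Xn, each uniform over m states (pure redundancy).
p_copy = np.zeros((m,) * n)
for s in range(m):
    p_copy[(s,) * n] = 1 / m
print(np.isclose(o_information(p_copy), (n - 2) * np.log2(m)))  # -> True

# Lower bound: n fair bits constrained to even parity (pure synergy).
p_par = np.zeros((2,) * n)
for bits in itertools.product((0, 1), repeat=n - 1):
    p_par[bits + (sum(bits) % 2,)] = 2.0 ** (1 - n)
print(np.isclose(o_information(p_par), 2 - n))  # -> True
```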

Extensions include:

  • Local and gradient O-information: Pointwise or variable/pair-specific contributions, such as the local O-information $\omega(x^n)$ (Scagliarini et al., 2021), and discrete gradients ${}_i\Omega(X^n) = \Omega(X^n) - \Omega(X^n_{-i})$ that localize redundancy/synergy structure (Scagliarini et al., 2022).
  • Structured O-information: Integration over variable groups, focusing on between-group redundancy or synergy while ignoring within-group effects; crucial for modular networks (Pascual-Marqui et al., 11 Jul 2025).
  • Dynamic/dynamical O-information: Generalization to multivariate time series using entropy rates and transfer entropy expansions (Mijatovic et al., 2024, Stramaglia et al., 2020).
  • Hierarchy via $\Delta^k$: The O-information is $-\Delta^2(X)$ in the $\Delta^k$ spectrum, which connects it to quantitative redundancy/synergy detection at any order (Varley, 12 Jan 2026).
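The gradient extension is a one-line difference on top of any pmf-based $\Omega$ implementation. The sketch below (illustrative, not the cited authors' code) computes it for 4-bit even parity, where removing any variable destroys all dependence, so every gradient equals $-2$ bits:

```python
import itertools
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def o_information(pmf):
    pmf = np.asarray(pmf, dtype=float)
    n = pmf.ndim
    omega = (n - 2) * entropy_bits(pmf)
    for j in range(n):
        rest = tuple(k for k in range(n) if k != j)
        omega += entropy_bits(pmf.sum(axis=rest)) - entropy_bits(pmf.sum(axis=j))
    return omega

def o_information_gradient(pmf, i):
    """Discrete gradient: Omega of the full system minus Omega without X_i."""
    return o_information(pmf) - o_information(pmf.sum(axis=i))

# 4-bit even parity: the full system is synergistic (Omega = -2 bits), while
# any three of the four bits are mutually independent (Omega = 0), so each
# gradient is -2, flagging every variable as a participant in the synergy.
n = 4
pmf = np.zeros((2,) * n)
for bits in itertools.product((0, 1), repeat=n - 1):
    pmf[bits + (sum(bits) % 2,)] = 2.0 ** (1 - n)
grads = [o_information_gradient(pmf, i) for i in range(n)]
print(grads)  # -> [-2.0, -2.0, -2.0, -2.0]
```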

4. Estimation Methods and Practical Algorithms

Calculating O-information for real data requires estimation of joint entropies and/or total correlations on the observed multivariate distribution. The core algorithmic steps are:

  1. Estimate $H(X^n)$ and all needed marginal and conditional entropies $H(X_j)$, $H(X_{-j})$, $H(X_j \mid X_{-j})$.
  2. Compute TC and DTC (or plug directly into the $(n-2)H(X^n) + \sum_j [H(X_j) - H(X_{-j})]$ formula).
  3. For "leave-one-out" versions or gradients, work with $\Omega(X^n_{-i})$ as needed.

O-information is computationally tractable for moderate $n$ (the cost is linear in $n$, provided entropy and conditional entropy estimates are feasible), contrasting favorably with the exponential scaling of a full partial information decomposition (PID) (Bounoua et al., 2024).
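For continuous data, a common parametric shortcut is to assume joint Gaussianity, under which every entropy in the formula reduces to a covariance log-determinant and the $\tfrac{1}{2}\log(2\pi e)$ constants cancel out of $\Omega$ exactly. The sketch below (our own illustrative estimator, not the cited authors' code) implements that idea:

```python
import numpy as np

def gaussian_o_information(data):
    """Omega estimate in nats from samples (rows = observations, columns =
    variables), assuming the data are approximately jointly Gaussian. With
    H = 0.5 * (d*log(2*pi*e) + logdet(Sigma)), the constants cancel, leaving
    only log-determinants of covariance submatrices."""
    cov = np.cov(data, rowvar=False)
    n = cov.shape[0]
    omega = (n - 2) * np.linalg.slogdet(cov)[1]
    for j in range(n):
        rest = [k for k in range(n) if k != j]
        omega += np.log(cov[j, j])                              # log var(X_j)
        omega -= np.linalg.slogdet(cov[np.ix_(rest, rest)])[1]  # logdet of Sigma_{-j}
    return 0.5 * omega

rng = np.random.default_rng(0)
# Three noisy copies of one latent source: a redundancy-dominated system,
# so the estimate should come out clearly positive.
z = rng.standard_normal(20_000)
x = np.column_stack([z + 0.3 * rng.standard_normal(20_000) for _ in range(3)])
print(gaussian_o_information(x) > 0)  # -> True
```

Nonparametric alternatives (plug-in histograms, k-nearest-neighbor entropy estimators) follow the same three-step recipe with a different entropy estimator.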

5. Relationships to Other High-Order Information Measures

O-information is situated at the intersection of several classical and modern multivariate information metrics (Rosas et al., 2019, Varley, 12 Jan 2026):

  • Total correlation (TC, multi-information): Quantifies overall constraint and redundancy; always nonnegative.
  • Dual total correlation (DTC): Associated with synergy; largest when only joint configurations contain nontrivial information.
  • Tononi–Sporns–Edelman (TSE) complexity: Sensitive to dependency strength but not to redundancy/synergy balance—TSE conflates the two, while O-information distinguishes them.
  • Partial information decomposition (PID): Decomposes mutual information into redundant, unique, and synergistic atoms; O-information is fully symmetric and target-agnostic, providing a faster, coarser redundancy-synergy index that scales to high nn.
  • $\Delta^k$ and $\Gamma^k$ families: O-information ($-\Delta^2(X)$) sits within a hierarchy of whole-minus-parts statistics that distinguish progressively higher-order dependency structures (Varley, 12 Jan 2026).

6. Example Applications and Empirical Results

Neuroscience and Physiology: O-information has been applied to fMRI, EEG, and large-scale spike train datasets to reveal redundant and synergistic brain subsystems, particularly in studies of conscious perception and neural integration (Bounoua et al., 2024, Stramaglia et al., 2020, Scagliarini et al., 2021, Mijatovic et al., 2024).

Music Analysis: Comparative studies of Bach's four-voice chorales and Corelli's string trios demonstrated that Bach’s music is synergy-dominated (negative O-information), indicating high-order constraints not captured by pairwise analysis, while Corelli’s texture is redundancy-dominated (positive O-information) due to explicit voice doubling (Rosas et al., 2019, Scagliarini et al., 2021).

Machine Learning and Quantum-Inspired Architectures: O-information tracked the emergence of generalization (grokking) in tensor-network classifiers, coinciding with transitions in entanglement entropy and test accuracy (Pomarico et al., 31 Jul 2025).

Multivariate Time Series and Networks: The dynamic (rate) formulation enables quantification of high-order interactions in physiological and synthetic networks, distinguishing reconfigurations in redundancy and synergy under experimental perturbations (Mijatovic et al., 2024).

Complex Systems and Statistical Mechanics: In spin systems, O-information signals the presence of higher-order couplings, transitioning from near zero for pairwise-only interactions to large negative values as $k$-body interactions grow (Rosas et al., 2019).

7. Limitations, Current Generalizations, and Future Research

Future Directions:

  • Integration of O-information with causal, dynamical, and PID-style decompositions.
  • Scalability to larger nn via network sparsity, efficient bias-corrected estimators, or exploitation of system-specific structure.
  • Application to ecological, genetic, and highly modular data with group‐wise formulations.

In summary, O-information provides a principled, scalable, and interpretable scalar diagnostic for high-order information structure in multivariate systems, facilitating quantitative discrimination between redundancy- and synergy-dominated regimes, localizing high-order informational circuits, and advancing the theoretical and applied analysis of complex networks (Rosas et al., 2019, Varley, 12 Jan 2026, Bounoua et al., 2024, Scagliarini et al., 2022, Pascual-Marqui et al., 11 Jul 2025, Scagliarini et al., 2021, Mijatovic et al., 2024, Pomarico et al., 31 Jul 2025, Stramaglia et al., 2020).
