Probability-Phase Mutual Information
- Probability-phase mutual information is a quantum measure that quantifies correlations between measurable probabilities and hidden phase data in ensembles of pure states.
- It leverages geometric quantum mechanics on complex projective spaces to capture ensemble-level coherence beyond standard density matrix methods.
- This measure is vital in quantum resource theories, offering insights into hidden coherence in applications like quantum computation, communication, and conditional thermalization.
Probability-phase mutual information (I(P;Φ)) measures the statistical correlation between measurement-accessible probabilities and measurement-inaccessible phases in an ensemble of quantum pure states, distinctively capturing ensemble-level coherence that standard density-matrix-based coherence measures cannot resolve. In the geometric formulation of quantum mechanics, where a quantum state is represented on complex projective space ℂP^{D−1} with coordinates separated into the probability simplex Σ^{D−1} and the phase torus 𝕋^{D−1}, I(P;Φ) quantifies the degree to which the probability and phase components are correlated across the ensemble. This measure is central for diagnosing coherence resources and for distinguishing operationally relevant features in quantum resource theories, especially when averaging over the ensemble obscures non-classical structure.
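As a concrete illustration of this coordinate split, the following minimal Python sketch maps a normalized state vector onto its probability-simplex and phase-torus coordinates; it assumes the global U(1) phase is fixed by referencing the first basis component, and the helper name probability_phase_coords is purely illustrative rather than the paper's convention.

```python
import numpy as np

def probability_phase_coords(psi, ref=0):
    """Map a normalized state vector in C^D to (probability, phase) coordinates.

    p_k = |psi_k|^2 lives on the probability simplex; the phases are taken
    relative to component `ref`, removing the unphysical global U(1) phase
    and leaving D-1 independent angles on the phase torus.
    """
    psi = np.asarray(psi, dtype=complex)
    p = np.abs(psi) ** 2                       # probability coordinates
    phi = np.angle(psi) - np.angle(psi[ref])   # gauge-fixed relative phases
    return p, np.mod(phi, 2 * np.pi)

# Example: a qutrit state with nontrivial probabilities and phases.
psi = np.array([0.6, 0.64j, 0.48 * np.exp(1j * 0.3)])
p, phi = probability_phase_coords(psi / np.linalg.norm(psi))
```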
1. Definition and Significance
Probability-phase mutual information is defined for an ensemble of pure quantum states as the mutual information between the probability variables P (arising, for example, as squared amplitudes in a chosen basis) and phase variables Φ (the corresponding phase angles), considered over the ensemble's distribution q(p,φ) on ℂP^{D−1}. Explicitly,

I(P;Φ) = D_KL( μ_(P,Φ) || μ_P ⊗ μ_Φ ),

where μ_(P,Φ) is the joint ensemble measure (with respect to the Fubini-Study geometry), μ_P and μ_Φ are the respective marginals, and D_KL denotes the Kullback-Leibler divergence.
This measure quantifies how much knowledge of the probability components of an ensemble reduces uncertainty about phase components, thereby operationally defining ensemble-level coherence. Standard coherence measures on density matrices ignore this structure, as they cannot distinguish between ensembles with identical density matrices but disparate pure-state distributions—meaning that crucial information about coherence resources is lost upon statistical averaging.
2. Mathematical Formulation
Formally, for a joint ensemble measure q(p,φ) induced by the Fubini-Study metric, the probability-phase mutual information is given by

I(P;Φ) = H(P) + H(Φ) − H(P,Φ),

with H(P), H(Φ), H(P,Φ) denoting geometric entropies of the corresponding marginal and joint distributions. The mutual information can also be analyzed at different coarse-graining scales ε, with entropy scaling

H_ε(Z) = dim(Z) · ln(1/ε) + H(Z) + o(1)

for Z = (P,Φ), and similarly for the marginals; the ε-divergent terms cancel in the combination above, so the finite part I(P;Φ) quantifies the probability-phase mutual information as ε → 0.
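To make the coarse-grained construction concrete, the following minimal numerical sketch estimates H_ε(P), H_ε(Φ), and H_ε(P,Φ) with plug-in histogram entropies for a toy single-qubit ensemble and combines them into I(P;Φ); the divergent dim·ln(1/ε) contributions cancel in the combination. The helper names binned_entropy and mutual_information are illustrative and are not the paper's estimator.

```python
import numpy as np

def binned_entropy(samples, eps, ranges):
    """Plug-in entropy at coarse-graining scale eps: H_eps = -sum_k f_k ln f_k."""
    edges = [np.arange(lo, hi + eps, eps) for lo, hi in ranges]
    hist, _ = np.histogramdd(samples, bins=edges)
    f = hist.ravel() / hist.sum()
    f = f[f > 0]
    return -np.sum(f * np.log(f))

def mutual_information(p_samples, phi_samples, eps=0.1):
    """I(P;Phi) ~ H_eps(P) + H_eps(Phi) - H_eps(P,Phi); the dim*ln(1/eps)
    divergences cancel between the three terms, leaving the finite part."""
    p_rng = [(0.0, 1.0)] * p_samples.shape[1]
    phi_rng = [(0.0, 2 * np.pi)] * phi_samples.shape[1]
    joint = np.hstack([p_samples, phi_samples])
    return (binned_entropy(p_samples, eps, p_rng)
            + binned_entropy(phi_samples, eps, phi_rng)
            - binned_entropy(joint, eps, p_rng + phi_rng))

# Toy qubit ensemble: one probability coordinate p and one relative phase phi,
# with the phase strongly correlated with the probability.
rng = np.random.default_rng(0)
p = rng.uniform(0.0, 1.0, size=(50_000, 1))
phi_corr = np.mod(2 * np.pi * p + rng.normal(0.0, 0.1, size=p.shape), 2 * np.pi)
phi_ind = rng.uniform(0.0, 2 * np.pi, size=p.shape)

print(mutual_information(p, phi_corr))  # clearly positive
print(mutual_information(p, phi_ind))   # near zero, up to estimator bias
```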
Establishing I(P;Φ) as a bona-fide coherence monotone, the paper proves that it satisfies essential coherence resource-theory axioms: non-negativity, monotonicity under free operations, strong monotonicity (on average under selective measurements), convexity, additivity, and vanishing on pure states. The monotonicity under geometric “free channels”—which decompose as independent operations on the probability and phase coordinates—directly follows via the data-processing inequality for mutual information, and proofs are provided for all properties.
3. Comparison with Standard Coherence Measures
Traditional coherence quantifiers, such as the relative entropy of coherence

C_r(ρ) = S(Δ(ρ)) − S(ρ),

where Δ is the dephasing channel and S the von Neumann entropy, depend only on the density matrix ρ. They consider coherence to be a property of single states, not ensembles; consequently, different ensembles with the same ρ are indistinguishable under these measures.
Probability-phase mutual information can resolve this ambiguity. For instance, consider two ensembles yielding the same density matrix: I(P;Φ) may distinguish them by detecting correlations between their probability and phase structures. For a single pure state, corresponding to a Dirac measure on ℂP^{D−1}, the mutual information vanishes (I(P;Φ) = 0), since there is no ensemble spread, even if the pure state displays superposition (basis coherence). In contrast, for ensembles realizing mixed states, where the ensemble structure is richer, I(P;Φ) can be strictly positive and encode information inaccessible to standard density-matrix analyses.
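A concrete toy instance of this distinction: the two qubit ensembles below share the density matrix ρ = 𝟙/2, so the relative entropy of coherence cannot separate them, yet in one the probability and phase are perfectly correlated (I(P;Φ) = ln 2) while in the other they are independent (I(P;Φ) = 0). The construction and helper functions are illustrative and not taken from the paper.

```python
import numpy as np
from itertools import product

def ket(p, phi):
    """Qubit pure state sqrt(p)|0> + sqrt(1-p) e^{i phi} |1>."""
    return np.array([np.sqrt(p), np.sqrt(1.0 - p) * np.exp(1j * phi)])

def density_matrix(ensemble):
    """rho = sum_k w_k |psi_k><psi_k| for an ensemble {(p, phi): weight}."""
    rho = np.zeros((2, 2), dtype=complex)
    for (p, phi), w in ensemble.items():
        v = ket(p, phi)
        rho += w * np.outer(v, v.conj())
    return rho

def ensemble_mutual_information(ensemble):
    """I(P;Phi) for a discrete joint distribution over (p, phi) labels."""
    p_marg, phi_marg = {}, {}
    for (p, phi), w in ensemble.items():
        p_marg[p] = p_marg.get(p, 0.0) + w
        phi_marg[phi] = phi_marg.get(phi, 0.0) + w
    return sum(w * np.log(w / (p_marg[p] * phi_marg[phi]))
               for (p, phi), w in ensemble.items() if w > 0)

def rel_ent_coherence(rho):
    """Standard relative entropy of coherence C_r(rho) = S(diag(rho)) - S(rho)."""
    def S(r):
        w = np.linalg.eigvalsh(r)
        w = w[w > 1e-12]
        return -np.sum(w * np.log(w))
    return S(np.diag(np.diag(rho))) - S(rho)

# Ensemble A: probability and phase perfectly correlated.
ens_A = {(0.8, 0.0): 0.5, (0.2, np.pi): 0.5}
# Ensemble B: identical P and Phi marginals, but statistically independent.
ens_B = {(p, phi): 0.25 for p, phi in product([0.8, 0.2], [0.0, np.pi])}

rho_A, rho_B = density_matrix(ens_A), density_matrix(ens_B)
print(np.allclose(rho_A, rho_B))              # True: both equal identity/2
print(rel_ent_coherence(rho_A))               # ~0.0, identical for both ensembles
print(ensemble_mutual_information(ens_A))     # ln 2 ~ 0.693
print(ensemble_mutual_information(ens_B))     # 0.0
```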
4. Applications and Implications
The ensemble-level coherence measured by I(P;Φ) is pivotal in regimes where the preparation or evolution induces nontrivial correlations between probabilities and phases—such as in deep or geometric thermalization, where the projected ensembles reveal statistical structures not evident from the density matrix. In resource-theoretic frameworks, where “free operations” are defined at the ensemble level, I(P;Φ) directly quantifies operationally available coherence that could be harnessed for quantum computation, communication, or state discrimination.
A key operational implication is that phenomena such as conditional thermalization—where ensembles develop temperature-dependent probability-phase correlations—can be accurately characterized only via I(P;Φ), not density-matrix measures. Likewise, protocols sensitive to ensemble-level coherence (e.g., interference-based computations or conditional encoding strategies) depend on the structure exposed by probability-phase mutual information.
5. Coherence Surplus
The coherence surplus quantifies the excess of ensemble-level coherence over the standard density-matrix coherence. Using the chain rule for relative entropy, the ensemble-level coherence splits as

D_KL( μ_(P,Φ) || μ_P ⊗ unif_Φ ) = I(P;Φ) + D_KL( μ_Φ || unif_Φ ),

with D_KL(μ_Φ || unif_Φ) capturing the nonuniformity of the phase distribution beyond what is encoded in ρ.
The surplus reveals the “hidden” coherence lost in the passage to density matrices; it provides an upper bound for the standard coherence,

C_r(ρ) ≤ I(P;Φ) + D_KL( μ_Φ || unif_Φ ).

Thus, probability-phase mutual information and the phase non-uniformity together account for all accessible ensemble-level coherence, with standard measures reflecting only the part surviving statistical averaging.
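The decomposition can be checked directly on a discretized toy ensemble; the sketch below verifies the chain-rule identity D_KL(μ_(P,Φ) || μ_P ⊗ unif_Φ) = I(P;Φ) + D_KL(μ_Φ || unif_Φ) for an arbitrary (hypothetical) joint probability table over probability and phase cells.

```python
import numpy as np

# Hypothetical discretized joint distribution q over (P, Phi) cells:
# rows index probability cells, columns index phase cells.
q = np.array([[0.30, 0.10, 0.05],
              [0.05, 0.25, 0.25]])
q_p = q.sum(axis=1, keepdims=True)        # marginal over P
q_phi = q.sum(axis=0, keepdims=True)      # marginal over Phi
unif_phi = np.full_like(q_phi, 1.0 / q.shape[1])

def dkl(a, b):
    """Kullback-Leibler divergence sum a ln(a/b) over the support of a."""
    mask = a > 0
    return np.sum(a[mask] * np.log(a[mask] / b[mask]))

total = dkl(q, q_p * unif_phi)                 # D(mu_{P,Phi} || mu_P x unif_Phi)
mi = dkl(q, q_p * q_phi)                       # I(P;Phi)
phase_nonuniformity = dkl(q_phi, unif_phi)     # D(mu_Phi || unif_Phi)
print(np.isclose(total, mi + phase_nonuniformity))  # True: chain rule holds
```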
6. Operational Resource Theory and Free Operations
In the geometric coherence theory developed, free operations are channels acting independently on probability and phase, Λ = Λ_P ⊗ Λ_Φ. Under such operations, the mutual information I(P;Φ) can only decrease, securing its role as a monotone. The theory formally situates probability-phase mutual information as the correct resource quantifier for operational scenarios where the full ensemble, not just its density matrix, is accessible or manipulable.
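A minimal classical sketch of this monotonicity: a free channel on a discretized ensemble is modeled here as independent stochastic maps acting on the probability and phase cells (a toy stand-in for Λ = Λ_P ⊗ Λ_Φ, not the paper's construction), and the data-processing inequality then guarantees that the estimated I(P;Φ) cannot increase.

```python
import numpy as np

rng = np.random.default_rng(1)

def dkl(a, b):
    mask = a > 0
    return np.sum(a[mask] * np.log(a[mask] / b[mask]))

def mutual_information(q):
    """I(P;Phi) of a discretized joint distribution q over (P, Phi) cells."""
    q_p = q.sum(axis=1, keepdims=True)
    q_phi = q.sum(axis=0, keepdims=True)
    return dkl(q, q_p * q_phi)

def random_stochastic(n):
    """Random column-stochastic matrix: a classical channel on n outcomes."""
    m = rng.random((n, n))
    return m / m.sum(axis=0, keepdims=True)

# Correlated joint distribution over probability and phase cells.
q = np.array([[0.30, 0.10, 0.05],
              [0.05, 0.25, 0.25]])

# Free operation: independent local channels on the P and Phi coordinates.
L_p, L_phi = random_stochastic(2), random_stochastic(3)
q_out = L_p @ q @ L_phi.T

print(mutual_information(q_out) <= mutual_information(q) + 1e-12)  # True
```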
7. Summary and Perspectives
Probability-phase mutual information provides a rigorous, operationally motivated, ensemble-level coherence measure based on the geometric structure of quantum state space. By extending beyond density-matrix superposition, it quantifies the statistical correlations lost in averaging, enabling resource quantification and state discrimination in quantum communication and computation protocols that exploit the full ensemble structure. The coherence surplus further specifies the degree to which ensemble-level coherence exceeds density-matrix coherence, highlighting the subtleties of quantum statistical structure relevant for advanced resource-theoretic and thermodynamic analyses.
This framework establishes I(P;Φ) as a foundational tool in quantum information theory for capturing and manipulating coherence resources in settings where ensemble preparation, deep thermalization, or noncommuting measurement protocols require description beyond the density matrix formalism (Hahn et al., 1 Oct 2025).