Monotonicity of Relative Entropy

Updated 17 September 2025
  • Monotonicity of relative entropy is the statement that quantum operations cannot increase the distinguishability between states, forming the basis of the data-processing inequality.
  • Petz's proof proceeds via operator algebra and recovery maps, while Uhlmann's approach employs quadratic-form interpolation, two contrasting methodologies in quantum information theory.
  • This property impacts quantum thermodynamics, resource theories, and communication by bounding error rates and guiding the design of reversible quantum channels.

The monotonicity of relative entropy is the principle that physical or information-processing transformations cannot increase the distinguishability between quantum states. This property underpins a vast portion of quantum information theory and statistical physics, and has deep mathematical foundations and operational consequences. In particular, it is equivalent to the data-processing inequality, ensures the validity of major entropy inequalities, governs information loss under quantum channels, and connects reversibility to the recoverability of quantum states.

1. Formal Definition and General Statement

Consider two quantum states (density operators) $\rho$ and $\sigma$ on a Hilbert space. The quantum relative entropy is

$$S(\rho\|\sigma) = \operatorname{Tr}[\rho \log \rho - \rho \log \sigma].$$

For any completely positive, trace-preserving (CPTP) map $\mathcal{E}$ (quantum channel), monotonicity states $$S(\rho\|\sigma) \geq S(\mathcal{E}(\rho)\|\mathcal{E}(\sigma)).$$ This inequality is known as the data-processing inequality (DPI) and forms the mathematical backbone for the impossibility of increasing information by local, physical transformations (1105.4865, Sagawa, 2012, Pérez-Pardo, 2022, Matheus et al., 14 Sep 2025).
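
As a concrete illustration (not drawn from the cited papers), the following Python sketch computes the Umegaki relative entropy for a randomly generated pair of full-rank qubit states and spot-checks the data-processing inequality under a depolarizing channel; the sampling routine and noise parameter are illustrative choices.

```python
# A minimal numerical sketch (not from the cited papers): check the data-processing
# inequality S(rho||sigma) >= S(E(rho)||E(sigma)) for a random pair of full-rank
# qubit states and a depolarizing channel E.
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(0)

def random_state(d):
    """Random full-rank density matrix (illustrative sampling)."""
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T + 1e-3 * np.eye(d)   # small ridge keeps it full rank
    return rho / np.trace(rho)

def rel_entropy(rho, sigma):
    """Umegaki relative entropy S(rho||sigma) = Tr[rho(log rho - log sigma)] (nats)."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def depolarize(rho, p):
    """CPTP depolarizing channel E(rho) = (1-p) rho + p I/d."""
    d = rho.shape[0]
    return (1 - p) * rho + p * np.eye(d) / d

rho, sigma = random_state(2), random_state(2)
before = rel_entropy(rho, sigma)
after = rel_entropy(depolarize(rho, 0.3), depolarize(sigma, 0.3))
print(f"S(rho||sigma)       = {before:.6f}")
print(f"S(E(rho)||E(sigma)) = {after:.6f}")
assert after <= before + 1e-10   # data-processing inequality
```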

2. Proof Methodologies: Petz vs Uhlmann

There are two principal, structurally distinct proof strategies: Petz's direct operator-algebraic approach and Uhlmann's quadratic form interpolation.

Petz's Operator-Algebraic Approach

Petz's approach reformulates relative entropy in terms of the modular operator $\Delta_{\rho,\sigma}$ on the Hilbert-Schmidt space: $$S(\rho\|\sigma) = -\langle \rho^{1/2}, \log \Delta_{\rho,\sigma}\, \rho^{1/2} \rangle.$$ Upon coarse-graining (e.g., partial trace), Petz introduces an auxiliary isometric lifting operator $V_\rho$ that intertwines the reduced and original modular structures. The initial attempt erroneously applied the contractive Jensen operator inequality, but the correct proof recognizes $V_\rho$ as an isometry in the relevant cases. This enables valid use of the operator convexity of $-\log x$ for isometries, leading to

$$S(\mathcal{E}(\rho)\| \mathcal{E}(\sigma)) \leq S(\rho\|\sigma),$$

with equality if and only if a recovery map, the Petz recovery map, restores $\rho$ from $\mathcal{E}(\rho)$ (Matheus et al., 14 Sep 2025).
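
For readers who want to see the modular-operator expression concretely, the sketch below (a finite-dimensional illustration, not Petz's operator-algebraic argument) represents $\Delta_{\rho,\sigma}$ as the superoperator $X \mapsto \sigma X \rho^{-1}$ on Hilbert-Schmidt space (one common convention; subscript conventions differ between references) and checks numerically that $-\langle \rho^{1/2}, \log \Delta_{\rho,\sigma}\,\rho^{1/2}\rangle$ reproduces the Umegaki trace formula.

```python
# A minimal numerical sketch: verify the modular-operator expression for the
# relative entropy against the Umegaki trace formula, using the convention
# Delta_{rho,sigma}(X) = sigma X rho^{-1} (conventions vary between references).
import numpy as np
from scipy.linalg import logm, sqrtm, inv

rng = np.random.default_rng(1)

def random_state(d):
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T + 1e-3 * np.eye(d)
    return rho / np.trace(rho)

d = 3
rho, sigma = random_state(d), random_state(d)

# Matrix of the superoperator X -> sigma X rho^{-1} under row-major vectorization:
# vec(A X B) = (A kron B^T) vec(X).
Delta = np.kron(sigma, inv(rho).T)

v = sqrtm(rho).flatten()                      # vec(rho^{1/2}), row-major
S_modular = -(v.conj() @ logm(Delta) @ v).real
S_umegaki = np.trace(rho @ (logm(rho) - logm(sigma))).real

print(S_modular, S_umegaki)                   # the two expressions agree
assert abs(S_modular - S_umegaki) < 1e-8
```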

Uhlmann's Interpolation of Sesquilinear Forms

Uhlmann's route is based on the Pusz-Woronowicz interpolation of positive sesquilinear forms. Given states $\omega$ and $\nu$ on a unital $C^*$-algebra, Uhlmann constructs an interpolating quadratic form $$Y_t[\omega_R, \nu_L](a, b) = \langle [a], P^{1-t} Q^t [b] \rangle,$$ with $(P, Q)$ commuting positive operators arising from GNS representations. The relative entropy is then defined as a Dini derivative at $t = 0$: $$S[\omega, \nu] = -\lim_{t\to 0^+} \frac{Y_t[\omega_R, \nu_L](e, e) - \omega(e, e)}{t},$$ where $e$ is the identity. Under a positive, unital (Schwarz) map $\varphi$, Uhlmann's argument shows that the pull-back of quadratic forms preserves the ordering, yielding

$$S[\omega, \nu] \geq S[\omega \circ \varphi, \nu \circ \varphi],$$

and, in the operator-algebraic context, the argument extends directly to type III von Neumann algebras and non-invertible states (Pérez-Pardo, 2022, Reible, 8 Jan 2025).
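
In the finite-dimensional, full-rank realization of this construction, the interpolating form evaluated at the identity reduces to $Q_t = \operatorname{Tr}[\rho^{1-t}\sigma^t]$, and the Dini derivative at $t = 0^+$ recovers the Umegaki relative entropy. The sketch below checks this numerically with a one-sided finite difference; it illustrates the derivative formula only, not the general von Neumann algebra argument.

```python
# A minimal finite-dimensional sketch of Uhlmann's derivative formula: for full-rank
# density matrices, -d/dt Tr[rho^(1-t) sigma^t] at t = 0+ equals S(rho||sigma).
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(2)

def random_state(d):
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T + 1e-3 * np.eye(d)
    return rho / np.trace(rho)

def mpow(H, t):
    """Fractional power of a positive-definite Hermitian matrix."""
    w, U = np.linalg.eigh(H)
    return (U * w**t) @ U.conj().T

rho, sigma = random_state(3), random_state(3)

def Q(t):
    return np.trace(mpow(rho, 1 - t) @ mpow(sigma, t)).real

t = 1e-6                                   # one-sided finite difference at t = 0+
S_uhlmann = -(Q(t) - Q(0.0)) / t           # note Q(0) = Tr[rho] = 1
S_umegaki = np.trace(rho @ (logm(rho) - logm(sigma))).real
print(S_uhlmann, S_umegaki)                # agree up to O(t) discretization error
```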

Comparative Analysis

| Feature | Petz | Uhlmann |
| --- | --- | --- |
| Conceptual Basis | Modular operator, lifting isometries | Positive form interpolation |
| Scope | Intuitive, explicit when full rank | Fully general, abstract |
| Recovery Map Link | Direct to Petz recovery map | General monotonicity |
| Applicability | Finite-dimensional, invertible states | All normal states |
| Technical Demand | Operator algebra, isometric lifts | Quadratic form calculus |

(Matheus et al., 14 Sep 2025, Reible, 8 Jan 2025)

3. Operational Consequences in Quantum Information

a) Data-Processing and Entropic Uncertainty

Monotonicity clarifies that quantum measurements or noise invariably reduce the distinguishability of states, a fact harnessed for entropic uncertainty relations with quantum side information. It explains the trade-off in knowledge about complementary observables upon sequential or simultaneous measurement, leading to the formal uncertainty relation with quantum side information (UPQSI), where the monotonicity is explicitly used to derive lower bounds on joint conditional entropies (1105.4865): $$H(v|c) + H(w|b) \geq -\log r(v,w), \qquad r(v,w) = \max_{j,k} |\langle v_j | w_k \rangle|^2.$$
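
The sketch below illustrates only the special case without side information (trivial registers $b$, $c$), where the bound reduces to the Maassen-Uffink relation $H(Z) + H(X) \geq -\log_2 \max_{j,k} |\langle z_j | x_k \rangle|^2 = 1$ bit for qubit $Z$- and $X$-basis measurements; the state is randomly generated for illustration.

```python
# A minimal sketch of the side-information-free special case: for a qubit,
# Shannon entropies of Z- and X-basis outcomes obey
# H(Z) + H(X) >= -log2 max_jk |<z_j|x_k>|^2 = 1 bit (Maassen-Uffink).
import numpy as np

def shannon(p):
    p = p[p > 1e-15]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
rho = A @ A.conj().T
rho /= np.trace(rho)                      # random qubit state

Z = np.eye(2)                                  # computational (Z) basis
X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard (X) basis

pZ = np.array([(Z[:, j].conj() @ rho @ Z[:, j]).real for j in range(2)])
pX = np.array([(X[:, j].conj() @ rho @ X[:, j]).real for j in range(2)])

c = max(abs(Z[:, j].conj() @ X[:, k])**2 for j in range(2) for k in range(2))
print(shannon(pZ) + shannon(pX), ">=", -np.log2(c))
assert shannon(pZ) + shannon(pX) >= -np.log2(c) - 1e-10
```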

b) Thermodynamics and the Second Law

Monotonicity under CPTP maps underlies second-law-like statements such as the Clausius inequality ($W \geq \Delta F$) and the quantum Hatano–Sasa inequality for nonequilibrium steady states, by ensuring non-increase of the quantum relative entropy between state sequences under physical evolutions (Sagawa, 2012): the inequality $$D(\mathcal{E}(\rho) \| \mathcal{E}(\sigma)) \leq D(\rho \| \sigma)$$ is used to show that the change in entropy is bounded below by excess contributions in driven thermodynamic processes.
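
A minimal two-level illustration (with a hypothetical Hamiltonian and inverse temperature) of the relative-entropy statement behind such bounds: the identity $\beta\,(F(\rho) - F_{\mathrm{eq}}) = D(\rho\|\gamma)$ for the Gibbs state $\gamma$, and the non-increase of $D(\rho\|\gamma)$ under a Gibbs-preserving CPTP map.

```python
# A minimal sketch (hypothetical two-level example, not from the cited papers):
# beta*(F(rho) - F_eq) = D(rho||gamma) for a Gibbs state gamma, and non-increase
# of D(rho||gamma) under a Gibbs-preserving CPTP map.
import numpy as np
from scipy.linalg import logm, expm

beta = 1.0
H = np.diag([0.0, 1.5])                               # toy two-level Hamiltonian
gamma = expm(-beta * H) / np.trace(expm(-beta * H))   # Gibbs state

def rel_entropy(rho, sigma):
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def free_energy(rho):
    S_vN = -np.trace(rho @ logm(rho)).real            # von Neumann entropy (nats)
    return np.trace(rho @ H).real - S_vN / beta

rho = np.diag([0.9, 0.1])                             # a nonequilibrium diagonal state
print(beta * (free_energy(rho) - free_energy(gamma)), rel_entropy(rho, gamma))

def gibbs_preserving(rho, p=0.4):
    """CPTP map that fixes gamma: partial replacement by the Gibbs state."""
    return (1 - p) * rho + p * gamma

# Monotonicity: distance to equilibrium (hence nonequilibrium free energy) cannot grow.
assert rel_entropy(gibbs_preserving(rho), gamma) <= rel_entropy(rho, gamma) + 1e-12
```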

c) Resource Theories and Channel Capacities

Monotonicity is essential for the characterization of monotones in resource theories, for error exponents in channel discrimination, and for bounding quantum capacities via sandwiched Rényi divergences. For the sandwiched Rényi divergence $D_\alpha(\rho\|\sigma)$, monotonicity is proved for all $\alpha \geq 1/2$, confirming its operational legitimacy in contexts like quantum hypothesis testing and one-shot channel coding (Frank et al., 2013).
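
The sketch below implements the sandwiched Rényi divergence $D_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\log\operatorname{Tr}\big[(\sigma^{(1-\alpha)/2\alpha}\,\rho\,\sigma^{(1-\alpha)/2\alpha})^\alpha\big]$ and spot-checks its data-processing inequality under a depolarizing channel for several values of $\alpha \geq 1/2$; states and channel parameters are illustrative.

```python
# A minimal numerical sketch (hypothetical parameters): the sandwiched Renyi
# divergence and a spot check of its data-processing inequality for alpha >= 1/2.
import numpy as np

rng = np.random.default_rng(4)

def random_state(d):
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T + 1e-3 * np.eye(d)
    return rho / np.trace(rho)

def mpow(H, t):
    w, U = np.linalg.eigh(H)
    return (U * w**t) @ U.conj().T

def sandwiched_renyi(rho, sigma, alpha):
    """D_alpha(rho||sigma) = log Tr[(sigma^((1-a)/2a) rho sigma^((1-a)/2a))^a] / (a-1)."""
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    return np.log(np.trace(mpow(s @ rho @ s, alpha)).real) / (alpha - 1)

def depolarize(rho, p=0.25):
    d = rho.shape[0]
    return (1 - p) * rho + p * np.eye(d) / d

rho, sigma = random_state(2), random_state(2)
for alpha in (0.5, 0.9, 2.0, 5.0):
    before = sandwiched_renyi(rho, sigma, alpha)
    after = sandwiched_renyi(depolarize(rho), depolarize(sigma), alpha)
    print(f"alpha={alpha}: {before:.4f} >= {after:.4f}")
    assert after <= before + 1e-10          # data-processing inequality
```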

d) Recovery Maps and Approximate Reversibility

The tightness of monotonicity is realized when the channel action is reversible. Petz's theorem asserts that equality occurs if and only if a Petz recovery map reconstructs $\rho$ from $\mathcal{E}(\rho)$. Quantitative refinements, relating the decrease in relative entropy to the fidelity between $\rho$ and its recovered version, have been established (Berta et al., 2014, Sutter et al., 2015): $$D(\rho\|\sigma) - D(\mathcal{E}(\rho)\| \mathcal{E}(\sigma)) \geq -\log F(\rho, \mathcal{R}_{\sigma,\mathcal{E}}(\mathcal{E}(\rho))),$$ where $F$ denotes the quantum fidelity and $\mathcal{R}_{\sigma,\mathcal{E}}$ is the Petz recovery map.
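
The following sketch constructs the Petz recovery map for a (hypothetical) amplitude-damping channel, verifies that it always restores $\sigma$ exactly, and prints the relative-entropy decrease alongside $-\log F(\rho, \mathcal{R}_{\sigma,\mathcal{E}}(\mathcal{E}(\rho)))$ for comparison; it demonstrates the construction rather than proving the refined bound.

```python
# A minimal sketch (hypothetical amplitude-damping example): build the Petz map
# R(Y) = sigma^(1/2) E*(E(sigma)^(-1/2) Y E(sigma)^(-1/2)) sigma^(1/2), check that
# it recovers sigma exactly, and print the two quantities in the refined bound.
import numpy as np
from scipy.linalg import logm, sqrtm, inv

g = 0.3
K = [np.array([[1, 0], [0, np.sqrt(1 - g)]]),      # amplitude-damping Kraus operators
     np.array([[0, np.sqrt(g)], [0, 0]])]

channel = lambda X: sum(k @ X @ k.conj().T for k in K)
adjoint = lambda Y: sum(k.conj().T @ Y @ k for k in K)    # E*, Hilbert-Schmidt adjoint

def rel_entropy(a, b):
    return np.trace(a @ (logm(a) - logm(b))).real

def fidelity(a, b):
    """F(a,b) = (Tr sqrt( sqrt(a) b sqrt(a) ))^2."""
    s = sqrtm(a)
    return np.trace(sqrtm(s @ b @ s)).real ** 2

rng = np.random.default_rng(5)
def random_state(d=2):
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    r = A @ A.conj().T + 1e-3 * np.eye(d)
    return r / np.trace(r)

rho, sigma = random_state(), random_state()
Es = channel(sigma)
petz = lambda Y: sqrtm(sigma) @ adjoint(inv(sqrtm(Es)) @ Y @ inv(sqrtm(Es))) @ sqrtm(sigma)

assert np.allclose(petz(channel(sigma)), sigma)    # the Petz map always restores sigma

drop = rel_entropy(rho, sigma) - rel_entropy(channel(rho), channel(sigma))
print("relative-entropy decrease:   ", drop)
print("-log F(rho, R(E(rho))):      ", -np.log(fidelity(rho, petz(channel(rho)))))
```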

4. Extensions: Positive Maps, Infinite Dimensions, and Further Generalizations

a) Positive vs Completely Positive

Monotonicity holds under positive trace-preserving maps, not only CPTP (completely positive trace-preserving) maps. This broader validity exposes limitations in the use of relative entropy for detecting non-Markovianity; certain measures become blind if only positivity, not complete positivity, is checked (Müller-Hermes et al., 2015, Sargolzahi et al., 2019).
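
As an illustration, the transpose map is the standard example of a positive, trace-preserving map that is not completely positive; the sketch below checks non-complete-positivity via its Choi matrix and confirms that the data-processing inequality nevertheless holds for it (here with equality, since transposition preserves spectra and traces of products).

```python
# A minimal sketch: the transpose map is positive and trace preserving but not CP
# (its Choi matrix has a negative eigenvalue), yet monotonicity still holds for it.
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(6)

def random_state(d=2):
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    r = A @ A.conj().T + 1e-3 * np.eye(d)
    return r / np.trace(r)

def rel_entropy(a, b):
    return np.trace(a @ (logm(a) - logm(b))).real

transpose = lambda X: X.T

# Choi matrix of the transpose map on qubits: sum_{ij} |i><j| (x) T(|i><j|).
d = 2
E = np.eye(d)
choi = sum(np.kron(np.outer(E[:, i], E[:, j]), transpose(np.outer(E[:, i], E[:, j])))
           for i in range(d) for j in range(d))
print("min Choi eigenvalue:", np.linalg.eigvalsh(choi).min())   # negative => not CP

rho, sigma = random_state(), random_state()
assert rel_entropy(transpose(rho), transpose(sigma)) <= rel_entropy(rho, sigma) + 1e-10
```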

b) Infinite-Dimensional Operator Algebras

The relative entropy extends to infinite-dimensional von Neumann algebras via modular theory and vector representatives in standard forms. Uhlmann's monotonicity theorem (proven in the Araki–Uhlmann setting) guarantees monotonicity for normal positive functionals under unital Schwarz maps and, hence, covers all physical transformations in algebraic quantum field theory and statistical mechanics (Reible, 8 Jan 2025). Monotonicity is also crucial for defining the two-sided Bogoliubov inequality for KMS (thermal equilibrium) states, governing perturbations and free energy changes in the general setting.

c) Continuity and Stability Aspects

Monotonicity also constrains the discontinuity jumps of relative entropy along sequences of states converging to a limit: under quantum operations, the local discontinuity jump of relative entropy cannot increase, implying further stability for limits and approximations in information-theoretic and thermodynamic settings (Shirokov, 2022).

5. Generalized and Axiomatic Contexts

Beyond the quantum case, the monotonicity of relative entropy arises axiomatically from the postulate that distinguishability cannot increase under noise (data-processing). For any operationally meaningful divergence DD, this property and additivity force DD to interpolate between minimal and maximal divergences (min- and max-relative entropies), and to possess continuity in the interior of the probability simplex. This establishes a bijection between entropy and relative entropy, and operationalizes why quantities like the quantum (or classical) relative entropy and Rényi divergences occupy their central role (Gour et al., 2020).
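
A minimal classical sketch (with hypothetical distributions) of the resulting sandwich $D_{\min} \leq D \leq D_{\max}$ for the Kullback-Leibler divergence:

```python
# A minimal classical sketch (hypothetical distributions): the KL divergence sits
# between the min- and max-relative entropies.
import numpy as np

p = np.array([0.7, 0.3, 0.0])
q = np.array([0.25, 0.25, 0.5])

mask = p > 0
D_kl  = np.sum(p[mask] * np.log(p[mask] / q[mask]))
D_min = -np.log(np.sum(q[mask]))            # -log sum_{i in supp(p)} q_i
D_max = np.log(np.max(p[mask] / q[mask]))   # log max_i p_i/q_i

print(D_min, "<=", D_kl, "<=", D_max)
assert D_min <= D_kl <= D_max
```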

6. Geometric and Physical Analogues

In geometric contexts, such as the study of hypersurfaces in hyperbolic space, a notion of “relative entropy” defined via renormalized areas satisfies a monotonicity property along mean curvature flow: the difference in renormalized area (relative entropy) between two hypersurfaces is non-increasing in time, quantifying geometric “closeness” in analogy with information theory (Yao, 2022).

Key Formulas and Conceptual Summary

  • Umegaki quantum relative entropy: $D(\rho\|\sigma) = \operatorname{Tr}[\rho(\log \rho - \log \sigma)]$
  • Monotonicity / data-processing inequality: $D(\mathcal{E}(\rho)\|\mathcal{E}(\sigma)) \leq D(\rho\|\sigma)$ for CPTP $\mathcal{E}$
  • Petz recovery map: $\mathcal{R}_{\sigma,\mathcal{E}}(Y) = \sigma^{1/2}\, \mathcal{E}^*\!\left(\mathcal{E}(\sigma)^{-1/2}\, Y\, \mathcal{E}(\sigma)^{-1/2}\right) \sigma^{1/2}$
  • Characterization of equality: $D(\rho\|\sigma) = D(\mathcal{E}(\rho)\|\mathcal{E}(\sigma))$ if and only if a Petz map recovers $\rho$ from $\mathcal{E}(\rho)$.
  • Bounds via divergence families: any data-processing and additive divergence $D$ satisfies $D_{\min} \leq D \leq D_{\max}$, with $D_{\min}(p\|q) = -\log \sum_{i \in \operatorname{supp}(p)} q_i$ and $D_{\max}(p\|q) = \log \max_i (p_i/q_i)$ (Gour et al., 2020).
  • Von Neumann algebra generalization: for a unital Schwarz map $\alpha: M_1 \to M_2$ and positive normal functionals $\psi_{1,2}$, $\phi_{1,2}$ satisfying $\psi_2 \circ \alpha \leq \psi_1$ and $\phi_2 \circ \alpha \leq \phi_1$,

$$S_{M_1}(\psi_1, \phi_1) \leq S_{M_2}(\psi_2, \phi_2)$$

(Reible, 8 Jan 2025).

These results collectively place the monotonicity of relative entropy at the heart of quantum information theory, statistical mechanics, and mathematical physics, providing an essential constraint on physical, information-theoretic, and even geometric processes.
