
Determine whether operator Sinkhorn equals alternating minimization of a divergence

Determine whether there exists a divergence function on the manifold of density matrices (positive-definite density operators) such that the operator Sinkhorn algorithm for operator scaling can be expressed as alternating minimization of that divergence.
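
For concreteness, here is a minimal sketch of the operator Sinkhorn (Gurvits) iteration for operator scaling, assuming the completely positive map is given by Kraus operators acting on C^n; the function and variable names are illustrative and not taken from the referenced paper.

```python
import numpy as np

def inv_sqrtm(S):
    """Inverse square root of a Hermitian positive-definite matrix."""
    w, V = np.linalg.eigh(S)
    return (V / np.sqrt(w)) @ V.conj().T

def operator_sinkhorn(kraus, n_iter=200):
    """Alternately normalize sum_i A_i A_i^* ("rows") and sum_i A_i^* A_i
    ("columns") toward the identity. For maps that admit a doubly stochastic
    scaling, the iterates approach Kraus operators B_i with
    sum_i B_i B_i^* = I and sum_i B_i^* B_i = I."""
    A = [K.astype(complex) for K in kraus]
    for _ in range(n_iter):
        # Row step: left-multiply by (sum_i A_i A_i^*)^{-1/2}.
        L = inv_sqrtm(sum(K @ K.conj().T for K in A))
        A = [L @ K for K in A]
        # Column step: right-multiply by (sum_i A_i^* A_i)^{-1/2}.
        R = inv_sqrtm(sum(K.conj().T @ K for K in A))
        A = [K @ R for K in A]
    return A

# Usage: random Kraus operators on C^3; both residuals should be small.
rng = np.random.default_rng(0)
kraus = [rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
         for _ in range(4)]
B = operator_sinkhorn(kraus)
print(np.linalg.norm(sum(K @ K.conj().T for K in B) - np.eye(3)))
print(np.linalg.norm(sum(K.conj().T @ K for K in B) - np.eye(3)))
```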


Background

The classical Sinkhorn algorithm for matrix scaling is known to be equivalent to alternating minimization of the Kullback–Leibler divergence, i.e., to alternating e-projections in information geometry. Its non-commutative analogue, the operator Sinkhorn algorithm for operator scaling, has been shown to admit an interpretation as alternating e-projections with respect to the symmetric logarithmic derivative (SLD) metric in quantum information geometry.
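
As a sketch of the classical statement (the direction of the KL divergence below follows one common convention, and the notation is illustrative), the Sinkhorn iterates are the alternating I-projections

```latex
% Classical Sinkhorn as alternating KL minimization (one common convention).
\[
  D_{\mathrm{KL}}(B \,\|\, A) \;=\; \sum_{i,j} \Bigl( b_{ij} \log \frac{b_{ij}}{a_{ij}} - b_{ij} + a_{ij} \Bigr),
\]
\[
  A^{(t+1)} \;=\; \operatorname*{arg\,min}_{B \in \Pi_r} D_{\mathrm{KL}}\bigl(B \,\|\, A^{(t)}\bigr),
  \qquad
  A^{(t+2)} \;=\; \operatorname*{arg\,min}_{B \in \Pi_c} D_{\mathrm{KL}}\bigl(B \,\|\, A^{(t+1)}\bigr),
\]
% where the constraint sets fix the row sums and column sums:
\[
  \Pi_r = \{B \ge 0 : B\mathbf{1} = r\}, \qquad \Pi_c = \{B \ge 0 : B^{\top}\mathbf{1} = c\}.
\]
```

The minimizers over $\Pi_r$ and $\Pi_c$ are exactly the row and column rescalings performed by Sinkhorn's algorithm, which is the characterization the open problem seeks to extend to the operator setting.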

However, the SLD geometry on the manifold of density matrices has non-vanishing torsion, and the dual affine connections induced by a divergence are always torsion-free; this obstructs a straightforward divergence whose alternating minimization reproduces the operator Sinkhorn iterations. Establishing or refuting such a divergence-based characterization would parallel the classical characterization by the KL divergence and clarify the geometric nature of operator scaling.
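
One hypothetical way to formalize the question, assuming the algorithm is phrased on normalized Choi matrices and that the constraint sets mirror the classical row- and column-sum sets (neither choice is prescribed by the source), would ask for a divergence $D$ such that every iterate satisfies

```latex
% Hypothetical divergence-based characterization of operator Sinkhorn.
% M_1, M_2 are assumed analogues of the fixed-row-sum and fixed-column-sum sets.
\[
  \rho^{(t+1)} \;=\; \operatorname*{arg\,min}_{\sigma \in M_1} D\bigl(\sigma, \rho^{(t)}\bigr),
  \qquad
  \rho^{(t+2)} \;=\; \operatorname*{arg\,min}_{\sigma \in M_2} D\bigl(\sigma, \rho^{(t+1)}\bigr),
\]
\[
  M_1 = \{\sigma \succ 0 : \operatorname{Tr}_2 \sigma = I/n\},
  \qquad
  M_2 = \{\sigma \succ 0 : \operatorname{Tr}_1 \sigma = I/n\}.
\]
```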

References

However, due to the non-vanishing torsion in the space of density matrices, it is still open whether the operator Sinkhorn algorithm can be written as alternating minimization of some divergence.

Open problems in information geometry: a discussion at FDIG 2025 (Sei et al., arXiv:2509.06989, 2 Sep 2025), Section: Takeru MATSUDA (The University of Tokyo)