
Prove super-exponential convergence of the Sinkhorn-Newton-Sparse algorithm

Prove that the Sinkhorn-Newton-Sparse (SNS) algorithm achieves super-exponential convergence, i.e., that its sub-optimality gap decreases at a super-exponential rate across iterations under appropriate conditions. SNS comprises an initial phase of Sinkhorn matrix-scaling iterations, followed by Newton iterations whose search directions are obtained by solving, via conjugate gradient with line search, a linear system defined by a sparsified approximation to the Hessian of the Lyapunov potential.
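The paper does not pin down a precise rate, so the following is one standard formalization (an assumption on our part, not a statement from the paper) of what such a proof would establish, with Δ_k denoting the sub-optimality gap of the Lyapunov potential after k Newton iterations:

```latex
% One possible formalization (assumed here): q-order convergence of the
% sub-optimality gap \Delta_k across Newton iterations k.
\exists\, C > 0,\; q > 1 :\quad
\Delta_{k+1} \,\le\, C\, \Delta_k^{\,q}
\quad \text{for all sufficiently large } k,
% which yields \Delta_k \le \rho^{\,q^k} for some \rho \in (0,1) once the
% gap is small enough, i.e., decay faster than any fixed exponential rate.
```

The quadratic case q = 2 would match the classical local rate of exact Newton methods; the open question is whether such a bound survives the Hessian sparsification and inexact conjugate-gradient solves.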


Background

The paper introduces Sinkhorn-Newton-Sparse (SNS), a two-stage algorithm for entropic optimal transport. After a warm-start with Sinkhorn matrix scaling, SNS performs Newton-type iterations that solve a sparsified Hessian linear system for the Lyapunov potential, achieving O(n²) per-iteration complexity. Empirically, the authors observe very low iteration counts that suggest super-exponential convergence during the Newton stage.
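To make the two-stage structure concrete, here is a minimal, illustrative sketch of the control flow (Sinkhorn warm start, then sparsified-Hessian Newton steps with conjugate gradient and line search). It is not the authors' implementation: the quantile-based sparsification rule, the ridge term, and the iteration counts are placeholder assumptions.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.sparse import csr_matrix, bmat, diags, identity
from scipy.sparse.linalg import cg


def sns_sketch(C, r, c, eta, n_sinkhorn=20, n_newton=5, keep=0.05):
    """Illustrative SNS sketch: Sinkhorn warm start, then sparsified-Hessian
    Newton. Hyperparameters are placeholders, not values from the paper."""
    n, m = C.shape
    f, g = np.zeros(n), np.zeros(m)

    def dual(f_, g_):
        # Dual (Lyapunov) objective of entropic OT; Newton ascends this.
        return (f_ @ r + g_ @ c
                - eta * np.exp((f_[:, None] + g_[None, :] - C) / eta).sum())

    # Stage 1: log-domain Sinkhorn matrix scaling (warm start).
    for _ in range(n_sinkhorn):
        f = eta * (np.log(r) - logsumexp((g[None, :] - C) / eta, axis=1))
        g = eta * (np.log(c) - logsumexp((f[:, None] - C) / eta, axis=0))

    # Stage 2: Newton iterations on the concave dual, sparsified Hessian.
    for _ in range(n_newton):
        P = np.exp((f[:, None] + g[None, :] - C) / eta)  # current plan
        grad = np.concatenate([r - P.sum(axis=1), c - P.sum(axis=0)])

        # Sparsify the plan: keep only its largest entries (fraction `keep`).
        P_s = csr_matrix(np.where(P >= np.quantile(P, 1.0 - keep), P, 0.0))

        # Negated dual Hessian (1/eta)[[diag(P1), P], [P^T, diag(P^T 1)]] is
        # PSD but singular along constant shifts, so add a tiny ridge for CG.
        H = bmat([[diags(np.asarray(P_s.sum(axis=1)).ravel()), P_s],
                  [P_s.T, diags(np.asarray(P_s.sum(axis=0)).ravel())]]) / eta
        d, _ = cg(H + 1e-10 * identity(n + m), grad)

        # Backtracking line search on the dual objective (ascent direction).
        t = 1.0
        while dual(f + t * d[:n], g + t * d[n:]) < dual(f, g) and t > 1e-8:
            t *= 0.5
        f, g = f + t * d[:n], g + t * d[n:]

    return f, g
```

The sketch forms the dense plan P each iteration for clarity, which already costs O(n²); the point of the sparsification is that the Newton linear solve also stays at O(n²) rather than the O(n³) of a dense factorization.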

Despite strong empirical evidence, the authors explicitly state they cannot currently provide a theoretical proof of super-exponential convergence. Establishing such a result would rigorously validate the observed rapid convergence of SNS and clarify its theoretical guarantees.

References

"While we cannot prove the conjectured super-exponential convergence, the low iteration count in the Newton stage shows strong numerical support."

Accelerating Sinkhorn Algorithm with Sparse Newton Iterations (arXiv:2401.12253, Tang et al., 20 Jan 2024), Section 6, Numerical result (paragraph preceding Table 1).