Prove super-exponential convergence of the Sinkhorn-Newton-Sparse algorithm
Prove that the Sinkhorn-Newton-Sparse (SNS) algorithm achieves super-exponential convergence, i.e., that its sub-optimality gap decreases at a super-exponential rate across iterations under appropriate conditions. SNS comprises an initial warm-up phase of Sinkhorn matrix-scaling iterations, followed by Newton iterations in which each search direction is obtained by solving, via conjugate gradient, a linear system defined by a sparsified approximation of the Hessian of the Lyapunov potential, with the step size chosen by line search.
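Since the conjecture concerns a concrete two-stage procedure, a compact sketch may help fix notation. The following Python code sketches SNS for entropically regularized optimal transport under stated assumptions: the quantile-based sparsification rule, the ridge term, the Armijo line-search constants, and all function names (sns, sinkhorn_steps, conjugate_gradient) are illustrative choices, not the authors' reference implementation.

```python
import numpy as np
from scipy.special import logsumexp


def dual_objective(x, y, C, r, c, eta):
    """Lyapunov potential f(x, y) = eta * sum_ij exp((x_i + y_j - C_ij)/eta)
    - <x, r> - <y, c>, minimized at the optimal dual variables."""
    P = np.exp((x[:, None] + y[None, :] - C) / eta)
    return eta * P.sum() - x @ r - y @ c


def sinkhorn_steps(x, y, C, r, c, eta, n_iter):
    # Stage 1: Sinkhorn matrix-scaling iterations, written as exact blockwise
    # minimization of the dual potential in the log domain.
    for _ in range(n_iter):
        x = x - eta * (logsumexp((x[:, None] + y[None, :] - C) / eta, axis=1) - np.log(r))
        y = y - eta * (logsumexp((x[:, None] + y[None, :] - C) / eta, axis=0) - np.log(c))
    return x, y


def conjugate_gradient(H, g, n_iter=100, tol=1e-12):
    # Plain CG for H d = g; H is symmetric positive definite after the ridge below.
    d = np.zeros_like(g)
    res = g.copy()
    p = res.copy()
    rs = res @ res
    for _ in range(n_iter):
        Hp = H @ p
        alpha = rs / (p @ Hp)
        d += alpha * p
        res -= alpha * Hp
        rs_new = res @ res
        if rs_new < tol:
            break
        p = res + (rs_new / rs) * p
        rs = rs_new
    return d


def sns(C, r, c, eta, n_sinkhorn=20, n_newton=10, keep_quantile=0.99, ridge=1e-10):
    n, m = C.shape
    x, y = np.zeros(n), np.zeros(m)
    x, y = sinkhorn_steps(x, y, C, r, c, eta, n_sinkhorn)      # warm-up stage
    for _ in range(n_newton):                                  # Newton stage
        P = np.exp((x[:, None] + y[None, :] - C) / eta)
        grad = np.concatenate([P.sum(axis=1) - r, P.sum(axis=0) - c])
        # Sparsify the off-diagonal Hessian blocks by thresholding P: at small
        # eta the transport plan is near-sparse, so little information is lost.
        P_s = np.where(P >= np.quantile(P, keep_quantile), P, 0.0)
        H = np.block([[np.diag(P.sum(axis=1)), P_s],
                      [P_s.T, np.diag(P.sum(axis=0))]]) / eta
        H += ridge * np.eye(n + m)      # remove the translation null direction
        d = conjugate_gradient(H, -grad)
        # Backtracking (Armijo) line search on the dual objective.
        t, f0, slope = 1.0, dual_objective(x, y, C, r, c, eta), grad @ d
        while dual_objective(x + t * d[:n], y + t * d[n:], C, r, c, eta) > f0 + 1e-4 * t * slope:
            t *= 0.5
            if t < 1e-8:
                break
        x, y = x + t * d[:n], y + t * d[n:]
    return x, y


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 64
    C = rng.random((n, n))
    r = np.full(n, 1.0 / n)
    c = np.full(n, 1.0 / n)
    x, y = sns(C, r, c, eta=0.02)
    P = np.exp((x[:, None] + y[None, :] - C) / 0.02)
    print("marginal error:", np.abs(P.sum(axis=1) - r).max())
```

Note that in this sketch the thresholding only reduces the off-diagonal mass of the Hessian while keeping the full diagonal blocks, so the sparsified system remains diagonally dominant and CG is well defined; the open question is whether the sub-optimality gap in the Newton stage of such a scheme can be proved to decay super-exponentially.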
References
While we cannot prove the conjectured super-exponential convergence, the low iteration count in the Newton stage shows strong numerical support.
                — Tang et al., "Accelerating Sinkhorn Algorithm with Sparse Newton Iterations" (arXiv:2401.12253, 20 Jan 2024), Section 6, Numerical result (paragraph preceding Table 1)