The Riemannian geometry of Sinkhorn divergences (2405.04987v1)
Abstract: We propose a new metric between probability measures on a compact metric space that mirrors the Riemannian manifold-like structure of quadratic optimal transport but includes entropic regularization. Its metric tensor is given by the Hessian of the Sinkhorn divergence, a debiased variant of entropic optimal transport. We precisely identify the tangent space it induces, which turns out to be related to a Reproducing Kernel Hilbert Space (RKHS). As usual in Riemannian geometry, the distance is built by looking for shortest paths. We prove that our distance is geodesic, metrizes the weak-star topology, and is equivalent to an RKHS norm. Still, it retains the geometric flavor of optimal transport: as a paradigmatic example, translations are geodesics for the quadratic cost on $\mathbb{R}^d$. We also show two negative results on the Sinkhorn divergence that may be of independent interest: it is not jointly convex, and its square root is not a distance because it fails to satisfy the triangle inequality.
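The Sinkhorn divergence referred to above debiases entropic optimal transport by subtracting the two self-transport terms, $S_\varepsilon(\alpha,\beta) = \mathrm{OT}_\varepsilon(\alpha,\beta) - \tfrac12\,\mathrm{OT}_\varepsilon(\alpha,\alpha) - \tfrac12\,\mathrm{OT}_\varepsilon(\beta,\beta)$. The sketch below illustrates this construction on discrete point clouds with the quadratic cost, using plain Sinkhorn fixed-point iterations and the "sharp" plan cost $\langle P, C\rangle$; the paper's exact normalization and convention for $\mathrm{OT}_\varepsilon$ may differ, and all function names here are illustrative, not from the paper.

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps=0.1, n_iter=500):
    """Approximate entropic OT cost between weights a, b with cost matrix C.

    Runs Sinkhorn matrix-scaling iterations and returns the transport cost
    <P, C> of the resulting plan (the "sharp" convention; some references
    instead include the entropy term in the returned value).
    """
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):       # alternating marginal projections
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # entropic optimal plan
    return np.sum(P * C)

def sinkhorn_divergence(x, y, a, b, eps=0.1):
    """Debiased Sinkhorn divergence for the quadratic cost on point clouds."""
    sq = lambda p, q: np.sum((p[:, None, :] - q[None, :, :]) ** 2, axis=-1)
    return (sinkhorn_cost(a, b, sq(x, y), eps)
            - 0.5 * sinkhorn_cost(a, a, sq(x, x), eps)
            - 0.5 * sinkhorn_cost(b, b, sq(y, y), eps))
```

By construction the divergence vanishes when the two measures coincide, and for a cloud and its translate it stays strictly positive, consistent with the translation example in the abstract.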