Manifold learning in Wasserstein space (2311.08549v2)

Published 14 Nov 2023 in stat.ML, cs.LG, and math.DG

Abstract: This paper aims at building the theoretical foundations for manifold learning algorithms in the space of absolutely continuous probability measures on a compact and convex subset of $\mathbb{R}^d$, metrized with the Wasserstein-2 distance $\mathrm{W}$. We begin by introducing a construction of submanifolds $\Lambda$ of probability measures equipped with metric $\mathrm{W}_\Lambda$, the geodesic restriction of $\mathrm{W}$ to $\Lambda$. In contrast to other constructions, these submanifolds are not necessarily flat, but still allow for local linearizations in a similar fashion to Riemannian submanifolds of $\mathbb{R}^d$. We then show how the latent manifold structure of $(\Lambda,\mathrm{W}_\Lambda)$ can be learned from samples $\{\lambda_i\}_{i=1}^N$ of $\Lambda$ and pairwise extrinsic Wasserstein distances $\mathrm{W}$ only. In particular, we show that the metric space $(\Lambda,\mathrm{W}_\Lambda)$ can be asymptotically recovered in the sense of Gromov--Wasserstein from a graph with nodes $\{\lambda_i\}_{i=1}^N$ and edge weights $\mathrm{W}(\lambda_i,\lambda_j)$. In addition, we demonstrate how the tangent space at a sample $\lambda$ can be asymptotically recovered via spectral analysis of a suitable "covariance operator" using optimal transport maps from $\lambda$ to sufficiently close and diverse samples $\{\lambda_i\}_{i=1}^N$. The paper closes with some explicit constructions of submanifolds $\Lambda$ and numerical examples on the recovery of tangent spaces through spectral analysis.
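
The tangent-space recovery sketched in the abstract can be illustrated numerically. The following is a minimal, hypothetical Python sketch (not the authors' code): it assumes the measures are discretized as uniform point clouds, uses the POT library for the transport computations, approximates the optimal transport map by the barycentric projection of the discrete plan, and replaces the paper's covariance operator on $L^2(\lambda)$ with an empirical covariance of displacement fields. Function names are illustrative.

```python
# Minimal sketch (not from the paper): estimate tangent directions at a
# reference measure via spectral analysis of an empirical covariance of
# optimal-transport displacement fields. Measures are assumed to be
# discretized as uniform point clouds.
import numpy as np
import ot  # Python Optimal Transport (https://pythonot.github.io/)

def barycentric_ot_map(x_ref, x_tgt):
    """Approximate the OT map from the uniform measure on x_ref to the one
    on x_tgt by the barycentric projection of the discrete optimal plan."""
    n, m = len(x_ref), len(x_tgt)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    cost = ot.dist(x_ref, x_tgt)        # squared Euclidean cost matrix
    plan = ot.emd(a, b, cost)           # discrete optimal transport plan
    return (plan @ x_tgt) / a[:, None]  # T(x_i) as conditional mean, shape (n, d)

def estimate_tangent_space(x_ref, neighbour_clouds, k):
    """Spectral estimate of a k-dimensional tangent space at the reference
    measure from OT maps to nearby sample measures."""
    # "Log" vectors: displacement fields T_i - id, flattened into R^{n*d}.
    logs = np.stack([barycentric_ot_map(x_ref, x_i) - x_ref
                     for x_i in neighbour_clouds])
    logs = logs.reshape(len(neighbour_clouds), -1)
    cov = logs.T @ logs / len(neighbour_clouds)  # empirical covariance operator
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]            # sort by decreasing eigenvalue
    return eigvals[order[:k]], eigvecs[:, order[:k]]
```

In this caricature, the leading eigenvectors serve as a discrete proxy for a basis of the tangent space at $\lambda$, and the eigenvalue decay hints at the intrinsic dimension of $\Lambda$; the paper's actual construction works with absolutely continuous measures and a covariance operator on $L^2(\lambda)$.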

Citations (3)
