General Divergence Regularized Optimal Transport: Sample Complexity and Central Limit Theorems (2510.02489v1)
Abstract: Optimal transport has emerged as a fundamental methodology with applications spanning multiple research areas in recent years. However, the rate at which the empirical estimator converges to its population counterpart suffers from the curse of dimensionality, which prevents its application in high-dimensional settings. While entropic regularization has been shown to effectively mitigate the curse of dimensionality and achieve a parametric convergence rate under mild conditions, these statistical guarantees have not been extended to general regularizers. Our work bridges this gap by establishing analogous results for a broader family of regularizers. Specifically, under boundedness constraints, we prove a convergence rate of order $n^{-1/2}$ with respect to the sample size $n$. Furthermore, we derive several central limit theorems for divergence regularized optimal transport.
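To make the statistical setting concrete, the sketch below computes the plug-in (empirical) entropic optimal transport cost between two i.i.d. samples via log-domain Sinkhorn iterations; this is the standard entropic special case referenced in the abstract, not the paper's general divergence-regularized estimator. The regularization strength `eps`, the iteration count, and the Gaussian test distributions are illustrative assumptions, chosen only to show how one would empirically observe the $n^{-1/2}$ stabilization of the plug-in estimate as the sample size grows.

```python
import numpy as np
from scipy.special import logsumexp


def entropic_ot_cost(x, y, eps=0.1, n_iter=500):
    """Plug-in entropic OT cost between the empirical measures of
    samples x (n, d) and y (m, d).

    A minimal log-domain Sinkhorn sketch with uniform weights;
    `eps` is the entropic regularization strength (illustrative).
    """
    n, m = len(x), len(y)
    # Squared-Euclidean cost matrix between the two point clouds.
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    log_a = np.full(n, -np.log(n))  # uniform source weights (log scale)
    log_b = np.full(m, -np.log(m))  # uniform target weights (log scale)
    f = np.zeros(n)                 # dual potentials
    g = np.zeros(m)
    for _ in range(n_iter):
        # Alternating dual updates, numerically stable in the log domain.
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # Recover the primal coupling P and return the transport cost <P, C>.
    log_P = (f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :]
    P = np.exp(log_P)
    return (P * C).sum()


if __name__ == "__main__":
    # Hypothetical demo: the estimate settles toward its population
    # value as n grows, with fluctuations shrinking at roughly n^{-1/2}.
    rng = np.random.default_rng(0)
    for n in (100, 400, 1600):
        x = rng.normal(size=(n, 2))
        y = rng.normal(loc=0.5, size=(n, 2))
        print(n, entropic_ot_cost(x, y))
```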