Convergence of drift-diffusion PDEs arising as Wasserstein gradient flows of convex functions (2507.12385v1)
Abstract: We study the quantitative convergence of drift-diffusion PDEs that arise as Wasserstein gradient flows of linearly convex functions over the space of probability measures on $\mathbb{R}^d$. In this setting, the objective is in general not displacement convex, so it is not clear a priori whether global convergence even holds. Still, our analysis reveals that diffusion allows for a favorable interaction between Wasserstein geometry and linear convexity, leading to a general quantitative convergence theory analogous to that of gradient flows of convex functions in Euclidean space. Specifically, we prove that if the objective is convex and suitably coercive, the suboptimality gap decreases at a rate $O(1/t)$. This improves to a rate faster than any polynomial (and even exponential in compact settings) when the objective is strongly convex relative to the entropy. Our results extend the range of mean-field Langevin dynamics that enjoy quantitative convergence guarantees and enable new applications to optimization over the space of probability measures. To illustrate this, we establish quantitative convergence results for the minimization of entropy-regularized nonconvex problems, we propose and study an \emph{approximate Fisher Information} regularization covered by our setting, and we apply our results to an estimator for trajectory inference that involves minimizing the relative entropy with respect to the Wiener measure in path space.
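To fix ideas, the following is a minimal sketch of the setting described above, under the assumption (suggested by the references to mean-field Langevin dynamics and entropy regularization, but with notation chosen here rather than taken from the paper) that the objective has the form $F(\mu) = G(\mu) + \mathrm{Ent}(\mu)$ with $G$ linearly convex over $\mathcal{P}(\mathbb{R}^d)$:
\[
  % Wasserstein gradient flow of F = G + Ent: a drift-diffusion PDE,
  % where \delta G / \delta \mu denotes the first variation of G.
  \partial_t \mu_t \;=\; \nabla \cdot \Big( \mu_t \, \nabla \tfrac{\delta G}{\delta \mu}(\mu_t) \Big) \;+\; \Delta \mu_t ,
  \qquad \mu_0 \in \mathcal{P}(\mathbb{R}^d),
\]
\[
  % Quantitative guarantee in the convex, suitably coercive case.
  F(\mu_t) \;-\; \inf_{\nu \in \mathcal{P}(\mathbb{R}^d)} F(\nu) \;=\; O(1/t),
\]
with faster-than-polynomial (and, in compact settings, exponential) rates when $G$ is strongly convex relative to the entropy. The precise coercivity and convexity assumptions, and the constants in these rates, are those stated in the paper and are not reproduced in this sketch.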