Convergence Analysis of Probability Flow ODE for Score-based Generative Models (2404.09730v2)

Published 15 Apr 2024 in cs.LG, cs.NA, math.CA, and math.NA

Abstract: Score-based generative models have emerged as a powerful approach for sampling high-dimensional probability distributions. Despite their effectiveness, their theoretical underpinnings remain relatively underdeveloped. In this work, we study the convergence properties of deterministic samplers based on probability flow ODEs from both theoretical and numerical perspectives. Assuming access to $L^2$-accurate estimates of the score function, we prove the total variation between the target and the generated data distributions can be bounded above by $\mathcal{O}(d^{3/4}\delta^{1/2})$ in the continuous time level, where $d$ denotes the data dimension and $\delta$ represents the $L^2$-score matching error. For practical implementations using a $p$-th order Runge-Kutta integrator with step size $h$, we establish error bounds of $\mathcal{O}(d^{3/4}\delta^{1/2} + d\cdot(dh)^p)$ at the discrete level. Finally, we present numerical studies on problems up to 128 dimensions to verify our theory.
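To make the setting concrete, below is a minimal sketch of a deterministic probability-flow-ODE sampler discretized with a classical 4th-order Runge-Kutta method (the case $p = 4$ in the paper's notation). It is not the authors' implementation: the variance-preserving drift `beta(t)`, the reverse-time grid, and the placeholder `score_fn` are illustrative assumptions standing in for a learned, $L^2$-accurate score estimate.

```python
# Hedged sketch of a probability flow ODE sampler with an RK4 integrator.
# Assumptions (not from the paper): VP-style linear noise schedule beta(t),
# integration from t = T down to a small eps, and a user-supplied score_fn(x, t)
# approximating the score of the noised marginal p_t.
import numpy as np

def beta(t, beta_min=0.1, beta_max=20.0):
    """Linear noise schedule beta(t) on [0, 1] (assumed, VP-SDE style)."""
    return beta_min + t * (beta_max - beta_min)

def ode_rhs(x, t, score_fn):
    """Probability flow ODE right-hand side: dx/dt = -0.5 * beta(t) * (x + score(x, t))."""
    return -0.5 * beta(t) * (x + score_fn(x, t))

def rk4_sample(score_fn, dim, n_steps=100, T=1.0, eps=1e-3, seed=0):
    """Integrate the probability flow ODE from t = T to t = eps with RK4 steps."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)           # draw from the Gaussian prior at t = T
    ts = np.linspace(T, eps, n_steps + 1)  # reverse-time grid
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = t1 - t0                         # negative step size (reverse time)
        k1 = ode_rhs(x, t0, score_fn)
        k2 = ode_rhs(x + 0.5 * h * k1, t0 + 0.5 * h, score_fn)
        k3 = ode_rhs(x + 0.5 * h * k2, t0 + 0.5 * h, score_fn)
        k4 = ode_rhs(x + h * k3, t1, score_fn)
        x = x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

if __name__ == "__main__":
    # Toy usage: for a standard Gaussian target, the exact score is -x at every t,
    # so the ODE drift vanishes and samples remain approximately N(0, I).
    gaussian_score = lambda x, t: -x
    print(rk4_sample(gaussian_score, dim=8))
```

In the paper's error bound, the score-matching error $\delta$ enters through `score_fn`, while the discretization term $d\cdot(dh)^p$ is governed by the step size `h = (T - eps) / n_steps` and the order of the integrator.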

Authors (3)
  1. Daniel Zhengyu Huang (19 papers)
  2. Jiaoyang Huang (67 papers)
  3. Zhengjiang Lin (8 papers)
Citations (15)
