Nearly $d$-Linear Convergence Bounds for Diffusion Models via Stochastic Localization (2308.03686v3)

Published 7 Aug 2023 in stat.ML and cs.LG

Abstract: Denoising diffusions are a powerful method to generate approximate samples from high-dimensional data distributions. Recent results provide polynomial bounds on their convergence rate, assuming $L^2$-accurate scores. Until now, the tightest bounds were either superlinear in the data dimension or required strong smoothness assumptions. We provide the first convergence bounds which are linear in the data dimension (up to logarithmic factors) assuming only finite second moments of the data distribution. We show that diffusion models require at most $\tilde O(\frac{d \log^2(1/\delta)}{\varepsilon^2})$ steps to approximate an arbitrary distribution on $\mathbb{R}^d$ corrupted with Gaussian noise of variance $\delta$ to within $\varepsilon^2$ in KL divergence. Our proof extends the Girsanov-based methods of previous works. We introduce a refined treatment of the error from discretizing the reverse SDE inspired by stochastic localization.
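
For context, the bound concerns samplers of the following form: an Ornstein-Uhlenbeck forward process noises the data, and samples are drawn by discretizing the associated reverse SDE driven by the score of the noised distribution. The sketch below is illustrative only, not the paper's construction: it uses a toy Gaussian data distribution (so the exact score is available in closed form), a plain Euler-Maruyama scheme on a uniform grid, and hypothetical names (`sigma0`, `var_t`, `score`); the paper's analysis instead allows an $L^2$-accurate score estimate and uses step sizes that shrink toward the early-stopping time $\delta$.

```python
import numpy as np

# Illustrative constants (assumptions, not taken from the paper)
d = 2          # data dimension
sigma0 = 0.5   # std of the toy Gaussian data, so the exact score is known
T = 5.0        # forward-process time horizon
delta = 1e-3   # early-stopping noise level (the delta in the abstract)
N = 500        # number of discretization steps

rng = np.random.default_rng(0)

def var_t(t):
    # Var(X_t) for X_0 ~ N(0, sigma0^2 I) under the OU forward process
    # dX_t = -X_t dt + sqrt(2) dB_t
    return sigma0**2 * np.exp(-2.0 * t) + 1.0 - np.exp(-2.0 * t)

def score(x, t):
    # Exact score of p_t = N(0, var_t(t) I); in practice a learned,
    # L^2-accurate estimate s_theta(x, t) stands in for this
    return -x / var_t(t)

# Euler-Maruyama discretization of the reverse SDE
#   dY_s = (Y_s + 2 * score(Y_s, T - s)) ds + sqrt(2) dB_s
# run for s in [0, T - delta]; stopping at T - delta targets the data
# distribution corrupted with a small Gaussian noise, as in the abstract.
ts = np.linspace(0.0, T - delta, N + 1)
y = rng.standard_normal(d)  # initialize from the stationary N(0, I)
for s0, s1 in zip(ts[:-1], ts[1:]):
    h = s1 - s0
    drift = y + 2.0 * score(y, T - s0)
    y = y + h * drift + np.sqrt(2.0 * h) * rng.standard_normal(d)

print("sample:", y)  # approximately a draw from the slightly-smoothed toy data law
```

In the paper's terms, running such a scheme for $\tilde O(\frac{d \log^2(1/\delta)}{\varepsilon^2})$ steps with an accurate score suffices to approximate the $\delta$-smoothed data distribution to within $\varepsilon^2$ in KL divergence, with no smoothness assumptions on the data beyond finite second moments.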

Authors (4)
  1. Joe Benton (13 papers)
  2. Valentin De Bortoli (50 papers)
  3. Arnaud Doucet (161 papers)
  4. George Deligiannidis (55 papers)
Citations (75)
