To smooth a cloud or to pin it down: Guarantees and Insights on Score Matching in Denoising Diffusion Models (2305.09605v3)

Published 16 May 2023 in stat.ML and cs.LG

Abstract: Denoising diffusion models are a class of generative models which have recently achieved state-of-the-art results across many domains. Gradual noise is added to the data using a diffusion process, which transforms the data distribution into a Gaussian. Samples from the generative model are then obtained by simulating an approximation of the time reversal of this diffusion initialized by Gaussian samples. Recent research has explored adapting diffusion models for sampling and inference tasks. In this paper, we leverage known connections to stochastic control akin to the Föllmer drift to extend established neural network approximation results for the Föllmer drift to denoising diffusion models and samplers.
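
The sketch below is not from the paper; it only illustrates, under simplifying assumptions, the mechanism the abstract describes: an Ornstein-Uhlenbeck forward noising process that drives the data toward a Gaussian, and an Euler-Maruyama discretisation of its time reversal initialised from Gaussian samples. The function `score_fn` is a hypothetical stand-in for a trained score network approximating the gradient of the log marginal density.

```python
import numpy as np

def forward_noise(x0, t):
    """Sample x_t given x_0 for the forward SDE dX_t = -0.5 X_t dt + dW_t,
    whose conditional marginal is N(exp(-t/2) x_0, (1 - exp(-t)) I)."""
    mean = np.exp(-0.5 * t) * x0
    std = np.sqrt(1.0 - np.exp(-t))
    return mean + std * np.random.randn(*x0.shape)

def reverse_sample(score_fn, dim, T=5.0, n_steps=500):
    """Euler-Maruyama simulation of the reverse-time SDE
    dY_s = [0.5 Y_s + score(Y_s, T - s)] ds + dW_s,
    started from the Gaussian reference N(0, I)."""
    dt = T / n_steps
    x = np.random.randn(dim)           # initialise from the Gaussian
    for k in range(n_steps):
        t = T - k * dt                 # reverse time
        drift = 0.5 * x + score_fn(x, t)
        x = x + drift * dt + np.sqrt(dt) * np.random.randn(dim)
    return x
```

As a sanity check under these assumptions, if the data distribution were itself N(0, I) the exact score is `lambda x, t: -x`, and `reverse_sample` then returns approximately standard-normal draws.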

Authors (4)
  1. Francisco Vargas (25 papers)
  2. Teodora Reu (5 papers)
  3. Anna Kerekes (4 papers)
  4. Michael M Bronstein (3 papers)
Citations (2)
