
GradCheck: Analyzing classifier guidance gradients for conditional diffusion sampling (2406.17399v1)

Published 25 Jun 2024 in cs.LG

Abstract: To sample from an unconditionally trained Denoising Diffusion Probabilistic Model (DDPM), classifier guidance adds conditional information during sampling, but the gradients from classifiers, especially those not trained on noisy images, are often unstable. This study conducts a gradient analysis comparing robust and non-robust classifiers, as well as multiple gradient stabilization techniques. Experimental results demonstrate that these techniques significantly improve the quality of class-conditional samples for non-robust classifiers by providing more stable and informative classifier guidance gradients. The findings highlight the importance of gradient stability in enhancing the performance of classifier guidance, especially on non-robust classifiers.
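To make the setup concrete, the sketch below shows one reverse DDPM step with classifier guidance in the style described in the abstract: the unconditional model predicts the denoising mean, and the gradient of a classifier's log-probability for the target class shifts that mean. The names (`unet`, `classifier`, the schedule tensors `alphas`, `alphas_bar`, `betas`, and the `guidance_scale` / `max_grad_norm` arguments) are illustrative placeholders, and the gradient-norm clipping shown is just one plausible stabilization heuristic, not necessarily one of the techniques evaluated in the paper.

```python
import torch

def guided_ddpm_step(x_t, t, y, unet, classifier, alphas, alphas_bar, betas,
                     guidance_scale=1.0, max_grad_norm=None):
    """One reverse DDPM sampling step with classifier guidance.

    All model and schedule arguments are hypothetical placeholders; the
    optional gradient clipping is only an illustrative stabilization trick.
    """
    # Unconditional reverse-step mean and (fixed) variance from the DDPM.
    eps = unet(x_t, t)
    alpha_t, alpha_bar_t, beta_t = alphas[t], alphas_bar[t], betas[t]
    mean = (x_t - beta_t / (1.0 - alpha_bar_t).sqrt() * eps) / alpha_t.sqrt()
    var = beta_t

    # Classifier guidance gradient: grad_x log p(y | x_t).
    # Note: the classifier may not have been trained on noisy inputs,
    # which is exactly the case where these gradients tend to be unstable.
    with torch.enable_grad():
        x_in = x_t.detach().requires_grad_(True)
        log_probs = torch.log_softmax(classifier(x_in), dim=-1)
        selected = log_probs[range(len(y)), y].sum()
        grad = torch.autograd.grad(selected, x_in)[0]

    # Optional stabilization: clip the per-sample gradient norm (assumed NCHW tensors).
    if max_grad_norm is not None:
        norm = grad.flatten(1).norm(dim=1).clamp(min=1e-12)
        scale = (max_grad_norm / norm).clamp(max=1.0).view(-1, 1, 1, 1)
        grad = grad * scale

    # Shift the mean in the direction that increases p(y | x_t).
    mean = mean + guidance_scale * var * grad

    noise = torch.randn_like(x_t) if t > 0 else torch.zeros_like(x_t)
    return mean + var.sqrt() * noise
```

Repeating this step from pure noise down to t = 0 yields class-conditional samples from an unconditionally trained DDPM; the paper's analysis concerns how stabilizing `grad` (here crudely, via norm clipping) affects sample quality for robust versus non-robust classifiers.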

Authors (4)
  1. Philipp Vaeth (3 papers)
  2. Alexander M. Fruehwald (3 papers)
  3. Magda Gregorova (16 papers)
  4. Benjamin Paassen (10 papers)
Citations (2)
