
U-Turn Diffusion (2308.07421v2)

Published 14 Aug 2023 in cs.LG and cs.CV

Abstract: We explore diffusion models of AI, which consist of a forward noise-injecting process and a reverse de-noising process, to understand how they encode information about the Ground Truth (GT) samples in the score function. Our observations indicate that the most essential information is stored primarily during the early stages of the forward process. Consequently, we propose the U-turn diffusion model, which modifies the traditional approach by shortening the duration of both the forward process and the subsequent reverse dynamics, starting from the final configuration of the forward process. To determine the optimal moment for the U-turn, ensuring that synthetic samples generated at the end of the reverse process are independently and identically distributed (i.i.d.) according to the probability distribution implicitly represented by the GT samples, we utilize various analytical tools, including auto-correlation analysis and the Kolmogorov-Smirnov Gaussianity test. Our experiments with ImageNet demonstrate that the U-turn diffusion model achieves state-of-the-art Fréchet Inception Distance scores with fewer Neural Function Evaluations. Notably, we achieve a 1.35-fold speed-up in inference without the need for retraining.
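
As a rough illustration of the U-turn idea described in the abstract, the sketch below noises GT samples only up to a chosen U-turn time, checks how Gaussian the noised state already is with a Kolmogorov-Smirnov test, and then runs standard reverse dynamics from that point instead of from the full horizon. This is a minimal sketch under assumptions, not the authors' implementation: it assumes a DDPM-style noise-prediction network `eps_model` (related to the score function by a known rescaling) and a precomputed noise schedule (`betas`, `alphas`, `alphas_cumprod`); all names are illustrative.

```python
import torch
from scipy import stats


def forward_noise(x0, t_uturn, alphas_cumprod):
    """Closed-form forward (noising) step: jump directly to the U-turn time."""
    a_bar = alphas_cumprod[t_uturn]
    noise = torch.randn_like(x0)
    return a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise


def ks_gaussianity_pvalue(x_t):
    """Kolmogorov-Smirnov test of the noised state against a standard normal.

    A high p-value suggests x_t is already close to pure Gaussian noise,
    i.e. the chosen U-turn time is late enough.
    """
    flat = x_t.flatten().cpu().numpy()
    flat = (flat - flat.mean()) / flat.std()
    return stats.kstest(flat, "norm").pvalue


@torch.no_grad()
def reverse_from_uturn(x_t, t_uturn, eps_model, betas, alphas, alphas_cumprod):
    """DDPM-style ancestral reverse dynamics started at t_uturn instead of the
    full horizon T, so only t_uturn network evaluations are required."""
    x = x_t
    for t in reversed(range(t_uturn)):
        t_batch = torch.full((x.shape[0],), t, device=x.device)
        eps = eps_model(x, t_batch)                      # predicted noise
        a, a_bar, b = alphas[t], alphas_cumprod[t], betas[t]
        mean = (x - b / (1.0 - a_bar).sqrt() * eps) / a.sqrt()
        x = mean + (b.sqrt() * torch.randn_like(x) if t > 0 else 0.0)
    return x
```

In this reading, the speed-up quoted in the abstract comes purely from choosing `t_uturn` smaller than the full number of diffusion steps while the Gaussianity and auto-correlation diagnostics confirm the noised state has forgotten the individual GT sample; no retraining of the network is involved.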

Citations (2)
