On Accelerating Diffusion-Based Sampling Process via Improved Integration Approximation
Abstract: A popular approach to sampling from a diffusion-based generative model is to solve an ordinary differential equation (ODE). In existing samplers, the coefficients of the ODE solvers are pre-determined by the ODE formulation, the reverse discrete timesteps, and the employed ODE methods. In this paper, we consider accelerating several popular ODE-based sampling processes (including EDM, DDIM, and DPM-Solver) by optimizing certain coefficients via improved integration approximation (IIA). We propose to minimize, for each timestep, a mean squared error (MSE) function with respect to the selected coefficients. The MSE is constructed by applying the original ODE solver over a set of fine-grained timesteps, which in principle provides a more accurate integration approximation for predicting the next diffusion state. The proposed IIA technique does not require any change to a pre-trained model, and only introduces a very small computational overhead for solving a number of quadratic optimization problems. Extensive experiments show that considerably better FID scores can be achieved by using IIA-EDM, IIA-DDIM, and IIA-DPM-Solver than their original counterparts when the number of neural function evaluations (NFE) is small (i.e., less than 25).
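The per-timestep optimization described in the abstract reduces to a small least-squares problem. The sketch below is illustrative only (the function and variable names are assumptions, not the authors' actual API): the baseline ODE-solver update predicts a next state `z_base`, a fine-grained run of the same solver yields a more accurate reference `z_ref`, and IIA-style coefficients `c` weight a few correction directions (columns of `basis`) so as to minimize the MSE between the corrected update and the reference.

```python
import numpy as np

def iia_coefficients(z_base, z_ref, basis):
    """Solve argmin_c || z_ref - (z_base + basis @ c) ||^2.

    Hedged sketch of the quadratic subproblem behind IIA; the
    correction directions in `basis` stand in for quantities such as
    differences of network outputs at nearby timesteps.

    z_base, z_ref : (d,) flattened diffusion states
    basis         : (d, k) correction directions, k small (e.g. 1-3)
    """
    residual = z_ref - z_base
    # Least-squares solve of basis @ c ~= residual; equivalent to the
    # closed form (basis^T basis)^{-1} basis^T residual, but more stable.
    c, *_ = np.linalg.lstsq(basis, residual, rcond=None)
    return c

# Toy check: if the fine-grained reference is exactly reachable from
# the baseline update via the basis, the coefficients are recovered.
rng = np.random.default_rng(0)
d, k = 64, 2
basis = rng.standard_normal((d, k))
z_base = rng.standard_normal(d)
c_true = np.array([0.3, -0.7])
z_ref = z_base + basis @ c_true

c = iia_coefficients(z_base, z_ref, basis)
print(np.allclose(c, c_true))  # True
```

Because the problem is quadratic in `c` with tiny `k`, solving it per timestep adds negligible cost next to a neural-network function evaluation, which is consistent with the "very small computational overhead" claim above.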
- Wasserstein GAN. arXiv:1701.07875 [stat.ML], 2017.
- Computer Methods for Ordinary Differential Equations and Differential-Algebraic Equations. Society for Industrial and Applied Mathematics, 1998.
- Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models. In ICML, 2022.
- Analytic-DPM: an Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models. In ICLR, 2022.
- C. M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
- WaveGrad: Estimating Gradients for Waveform Generation. arXiv:2009.00713, September 2020.
- P. Dhariwal and A. Nichol. Diffusion Models Beat GANs on Image Synthesis. arXiv:2105.05233 [cs.LG], 2021.
- Generative Adversarial Nets. In Proceedings of the International Conference on Neural Information Processing Systems, pages 2672–2680, 2014.
- Improved Training of Wasserstein GANs. In Advances in Neural Information Processing Systems, pages 5767–5777, 2017.
- Denoising diffusion probabilistic models. In NeurIPS, 2020.
- A. Hyvärinen. Estimation of Non-Normalized Statistical Models by Score Matching. Journal of Machine Learning Research, 6:695–709, 2005.
- Elucidating the Design Space of Diffusion-Based Generative Models. In 36th Conference on Neural Information Processing Systems (NeurIPS), 2022.
- Refining Generative Process with Discriminator Guidance in Score-based Diffusion Models. arXiv preprint arXiv:2211.17091 [cs.CV], 2022.
- Adam: A Method for Stochastic Optimization. arXiv preprint arXiv:1412.6980v9, 2017.
- Variational Diffusion Models. arXiv preprint arXiv:2107.00630, 2021.
- BDDM: Bilateral Denoising Diffusion Models for Fast and High-Quality Speech Synthesis. In ICLR, 2022.
- Pseudo Numerical Methods for Diffusion Models on Manifolds. In ICLR, 2022.
- DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Sampling in Around 10 Steps. In NeurIPS, 2022.
- A. Nichol and P. Dhariwal. Improved denoising diffusion probabilistic models. arXiv preprint arXiv:2102.09672, 2021.
- GLIDE: Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models. In ICML, 2022.
- B. T. Polyak. Some methods of speeding up the convergence of iteration methods. USSR Computational Mathematics and Mathematical Physics, 4:1–17, 1964.
- High-resolution image synthesis with latent diffusion models. In CVPR, 2022.
- U-Net: Convolutional Networks for Biomedical Image Segmentation. arXiv:1505.04597 [cs.CV], 2015.
- StyleGAN-XL: Scaling StyleGAN to large diverse datasets. In SIGGRAPH, 2022.
- Deep unsupervised learning using nonequilibrium thermodynamics. ICML, 2015.
- Denoising Diffusion Implicit Models. In ICLR, 2021.
- Maximum likelihood training of score-based diffusion models. In Advances in neural information processing systems (NeurIPS), 2021.
- Y. Song and S. Ermon. Generative Modeling by Estimating Gradients of the Data Distribution. In Advances in Neural Information Processing Systems (NeurIPS), pages 11895–11907, 2019.
- Score-Based Generative Modeling Through Stochastic Differential Equations. In ICLR, 2021.
- On the importance of initialization and momentum in deep learning. In International conference on Machine Learning (ICML), 2013.
- Diffusion Models: A Comprehensive Survey of Methods and Applications. arXiv preprint arXiv:2209.00796, 2022.
- G. Zhang. On Suppressing Range of Adaptive Stepsizes of Adam to Improve Generalisation Performance. arXiv:2302.01029 [cs.LG], 2023.
- Lookahead Diffusion Probabilistic Models for Refining Mean Estimation. In Computer Vision and Pattern Recognition (CVPR), 2023.
- Q. Zhang and Y. Chen. Fast Sampling of Diffusion Models with Exponential Integrator. arXiv:2204.13902 [cs.LG], 2022.