Efficient Denoising using Score Embedding in Score-based Diffusion Models (2404.06661v1)
Abstract: It is well known that training a denoising score-based diffusion model requires tens of thousands of epochs and a substantial amount of image data. In this paper, we propose a method to increase the efficiency of training score-based diffusion models by decreasing the number of epochs needed. We accomplish this by numerically solving the log-density Fokker-Planck (FP) equation to compute the score \textit{before} training. The pre-computed score is embedded into the image to encourage faster training under the sliced Wasserstein distance. Consequently, our method also reduces the number of images needed to train the neural network to learn an accurate score. Our numerical experiments demonstrate the improved performance of the proposed method compared to standard score-based diffusion models: it achieves similar quality to the standard method meaningfully faster.
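The abstract's training objective involves the sliced Wasserstein distance. As a point of reference for that ingredient (not the authors' implementation, whose details are not given here), a minimal Monte-Carlo estimator of the sliced Wasserstein-2 distance between two point clouds might look like the following; the function name and parameters are illustrative:

```python
import numpy as np

def sliced_wasserstein(x, y, n_projections=50, rng=None):
    """Monte-Carlo estimate of the sliced Wasserstein-2 distance
    between two point clouds x, y of shape (n_samples, dim)."""
    rng = np.random.default_rng(rng)
    dim = x.shape[1]
    total = 0.0
    for _ in range(n_projections):
        # Draw a random unit direction on the sphere.
        theta = rng.normal(size=dim)
        theta /= np.linalg.norm(theta)
        # Project both point clouds onto the direction and sort:
        # the 1-D Wasserstein-2 distance between equal-size samples
        # reduces to comparing sorted projections.
        px = np.sort(x @ theta)
        py = np.sort(y @ theta)
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_projections)
```

The appeal of this distance in high dimensions is that each term only requires 1-D projections and sorting, so the cost per projection is O(n log n) rather than solving a full optimal-transport problem.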