
Tweedie Moment Projected Diffusions For Inverse Problems (2310.06721v3)

Published 10 Oct 2023 in stat.CO

Abstract: Diffusion generative models unlock new possibilities for inverse problems because they allow strong empirical priors to be incorporated into scientific inference. Recently, diffusion models have been repurposed for solving inverse problems by forming Gaussian approximations to the conditional densities of the reverse process, using Tweedie's formula to parameterise the mean, complemented with various heuristics. To address the challenges arising from these approximations, we leverage higher-order information via Tweedie's formula and obtain a statistically principled approximation. We further provide a theoretical guarantee specifically for posterior sampling, which can lead to a better theoretical understanding of diffusion-based conditional sampling. Finally, we illustrate the empirical effectiveness of our approach for general linear inverse problems on toy synthetic examples as well as image restoration. We show that our method (i) removes the time-dependent step-size hyperparameters required by earlier methods, (ii) brings stability and better sample quality across multiple noise levels, and (iii) is, unlike earlier works, stable with variance-exploding (VE) forward processes.
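To make the role of Tweedie's formula concrete, the following is a minimal illustrative sketch (not the paper's code) of the first- and second-order identities in one dimension: for an observation y = x + ε with ε ~ N(0, σ²), the posterior mean is E[x | y] = y + σ² ∇ log p(y), and the posterior variance follows from the second derivative of log p(y). With a Gaussian prior the marginal p(y) is known in closed form, so both identities can be checked against the exact Gaussian posterior; all variable names below are chosen for illustration only.

```python
import numpy as np

# Tweedie's formula, 1-D sanity check:
#   E[x | y]   = y + sigma^2 * d/dy log p(y)              (first order)
#   Var[x | y] = sigma^2 + sigma^4 * d^2/dy^2 log p(y)    (second order)
# With a Gaussian prior x ~ N(mu, tau2), the marginal of y = x + eps,
# eps ~ N(0, sigma2), is N(mu, tau2 + sigma2), so the score and its
# derivative are available analytically.

mu, tau2, sigma2 = 1.0, 4.0, 0.25
y = 2.3  # an observed noisy sample

s2 = tau2 + sigma2           # marginal variance of y
score = -(y - mu) / s2       # d/dy log p(y)
hessian = -1.0 / s2          # d^2/dy^2 log p(y)

post_mean_tweedie = y + sigma2 * score
post_var_tweedie = sigma2 + sigma2**2 * hessian

# Exact Gaussian posterior for comparison.
post_mean_exact = (tau2 * y + sigma2 * mu) / s2
post_var_exact = tau2 * sigma2 / s2

assert np.isclose(post_mean_tweedie, post_mean_exact)
assert np.isclose(post_var_tweedie, post_var_exact)
```

In diffusion-based inverse-problem solvers, the score of the marginal is supplied by a learned network rather than known analytically; the second-order identity is what supplies the covariance information that a mean-only Gaussian approximation lacks.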
