Provably Robust Score-Based Diffusion Posterior Sampling for Plug-and-Play Image Reconstruction (2403.17042v2)

Published 25 Mar 2024 in eess.IV, cs.CV, cs.LG, eess.SP, math.OC, and stat.ML

Abstract: In a great number of tasks in science and engineering, the goal is to infer an unknown image from a small number of measurements collected through a known forward model describing a certain sensing or imaging modality. Due to resource constraints, this task is often extremely ill-posed, which necessitates the adoption of expressive prior information to regularize the solution space. Score-based diffusion models, owing to their impressive empirical success, have emerged as appealing candidates for an expressive prior in image reconstruction. In order to accommodate diverse tasks at once, it is of great interest to develop efficient, consistent, and robust algorithms that incorporate unconditional score functions of an image prior distribution in conjunction with flexible choices of forward models. This work develops an algorithmic framework for employing score-based diffusion models as an expressive data prior in general nonlinear inverse problems. Motivated by the plug-and-play framework in the imaging community, we introduce a diffusion plug-and-play method (DPnP) that alternately calls two samplers: a proximal consistency sampler based solely on the likelihood function of the forward model, and a denoising diffusion sampler based solely on the score functions of the image prior. The key insight is that denoising under white Gaussian noise can be solved rigorously via both stochastic (i.e., DDPM-type) and deterministic (i.e., DDIM-type) samplers using the unconditional score functions. We establish both asymptotic and non-asymptotic performance guarantees for DPnP, and provide numerical experiments to illustrate its promise in solving both linear and nonlinear image reconstruction tasks. To the best of our knowledge, DPnP is the first provably robust posterior sampling method for nonlinear inverse problems using unconditional diffusion priors.
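
The abstract describes the alternating structure of DPnP: a proximal consistency sampler that uses only the likelihood of the forward model, and a denoising sampler that uses only the score of the image prior. The sketch below is a minimal toy illustration of that alternation, not the paper's algorithm: it assumes a linear forward model y = A x + noise and a standard Gaussian prior, so the denoising step has a closed form (the paper instead solves it with DDPM- or DDIM-type samplers driven by a learned unconditional score), and the Langevin step sizes are arbitrary choices made for this example.

import numpy as np

# Toy illustration of the DPnP-style alternation, NOT the paper's algorithm:
# a linear forward model y = A x + noise and a standard Gaussian prior N(0, I)
# are assumed so every step has a simple closed form or a basic Langevin loop.

rng = np.random.default_rng(0)
d, m = 16, 8                                    # ill-posed: fewer measurements than unknowns
A = rng.standard_normal((m, d)) / np.sqrt(m)    # known forward model
x_true = rng.standard_normal(d)
sigma = 0.05                                    # measurement noise level
y = A @ x_true + sigma * rng.standard_normal(m)

def proximal_consistency_sampler(z, eta, n_steps=200, step=1e-3):
    # Langevin-type sampler targeting
    #   p(x) ∝ exp(-||y - A x||^2 / (2 sigma^2) - ||x - z||^2 / (2 eta^2)),
    # i.e. a stochastic analogue of a proximal data-consistency step;
    # it uses only the likelihood of the forward model.
    x = z.copy()
    for _ in range(n_steps):
        grad = A.T @ (y - A @ x) / sigma**2 - (x - z) / eta**2
        x = x + step * grad + np.sqrt(2.0 * step) * rng.standard_normal(d)
    return x

def denoising_sampler(z, eta):
    # Sample x ~ p(x | z) where z = x + eta * noise and the prior is N(0, I).
    # Closed form for this toy Gaussian prior; the paper instead performs this
    # denoising step with DDPM- or DDIM-type samplers driven only by the
    # learned unconditional score of the image prior.
    post_mean = z / (1.0 + eta**2)
    post_std = eta / np.sqrt(1.0 + eta**2)
    return post_mean + post_std * rng.standard_normal(z.shape)

# DPnP-style alternation: data-consistency step, then prior-driven denoising step.
eta = 0.3
x = np.zeros(d)
for _ in range(20):
    z = proximal_consistency_sampler(x, eta)    # uses only the likelihood
    x = denoising_sampler(z, eta)               # uses only the prior

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))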
