Taming Score-Based Diffusion Priors for Infinite-Dimensional Nonlinear Inverse Problems (2405.15676v1)

Published 24 May 2024 in stat.ML, cs.LG, cs.NA, and math.NA

Abstract: This work introduces a sampling method capable of solving Bayesian inverse problems in function space. It does not assume the log-concavity of the likelihood, meaning that it is compatible with nonlinear inverse problems. The method leverages recently defined infinite-dimensional score-based diffusion models as a learning-based prior, while enabling provable posterior sampling through a Langevin-type MCMC algorithm defined on function spaces. A novel convergence analysis is conducted, inspired by the fixed-point methods established for traditional regularization-by-denoising algorithms and compatible with weighted annealing. The obtained convergence bound explicitly depends on the approximation error of the score; a well-approximated score is essential to obtain a well-approximated posterior. Stylized and PDE-based examples are provided, demonstrating the validity of our convergence analysis. We conclude by discussing the method's challenges related to learning the score and its computational complexity.
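The abstract's core recipe, a Langevin-type iteration whose drift combines the likelihood gradient with a learned score acting as the prior, annealed over decreasing noise levels, can be sketched in a few lines. The sketch below is illustrative only: `score_prior` is a stand-in for the trained infinite-dimensional score network (here the exact score of a Gaussian), `forward` is a toy nonlinear operator rather than the paper's PDE forward maps, and the annealing schedule and step-size rule are assumptions, not the paper's algorithm.

```python
# Minimal sketch: annealed Langevin posterior sampling with a score-based
# prior, on a function discretized to a grid u in R^n. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 64  # grid points discretizing the function

def forward(u):
    # Hypothetical nonlinear forward operator G(u); the paper's PDE-based
    # examples would replace this with a PDE solve.
    return np.tanh(u)

def grad_log_likelihood(u, y, sigma_y):
    # Gradient of -||y - G(u)||^2 / (2 sigma_y^2) for the toy G above.
    return (y - forward(u)) * (1.0 - np.tanh(u) ** 2) / sigma_y**2

def score_prior(u, t):
    # Placeholder for the learned score s_theta(u, t): here the exact
    # score of a centered Gaussian with covariance (t + 0.1) * I.
    return -u / (t + 0.1)

# Synthetic data from a ground-truth function.
u_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
sigma_y = 0.1
y = forward(u_true) + sigma_y * rng.normal(size=n)

# Annealed (weighted) Langevin iteration: run unadjusted Langevin steps
# while decreasing the score's noise level t, so early sweeps explore and
# late sweeps sharpen around the posterior.
u = rng.normal(size=n)
for t in np.linspace(1.0, 0.01, 200):   # annealing schedule (assumed)
    eta = 5e-4 * t                       # step size tied to noise level
    for _ in range(20):
        drift = grad_log_likelihood(u, y, sigma_y) + score_prior(u, t)
        u = u + eta * drift + np.sqrt(2.0 * eta) * rng.normal(size=n)

print("relative L2 error:", np.linalg.norm(u - u_true) / np.linalg.norm(u_true))
```

In the paper's setting the iteration is defined directly on function space and the convergence bound degrades with the score approximation error; the placeholder Gaussian score above sidesteps that error entirely, which is exactly why a well-trained score network is the hard part in practice.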
