Randomized Physics-Informed Machine Learning for Uncertainty Quantification in High-Dimensional Inverse Problems (2312.06177v2)

Published 11 Dec 2023 in cs.LG

Abstract: We propose a physics-informed machine learning method for uncertainty quantification in high-dimensional inverse problems. In this method, the states and parameters of partial differential equations (PDEs) are approximated with truncated conditional Karhunen-Loève expansions (CKLEs), which, by construction, match the measurements of the respective variables. The maximum a posteriori (MAP) solution of the inverse problem is formulated as a minimization problem over CKLE coefficients, where the loss function is the sum of the norm of the PDE residuals and an $\ell_2$ regularization term. This MAP formulation is known as the physics-informed CKLE (PICKLE) method. Uncertainty in the inverse solution is quantified in terms of the posterior distribution of the CKLE coefficients, and we sample the posterior by solving a randomized PICKLE minimization problem, formulated by adding zero-mean Gaussian perturbations to the PICKLE loss function. We call the proposed approach the randomized PICKLE (rPICKLE) method. For linear and low-dimensional nonlinear problems (15 CKLE parameters), we show analytically and through comparison with Hamiltonian Monte Carlo (HMC) that the rPICKLE posterior converges to the true posterior given by Bayes' rule. For high-dimensional nonlinear problems with 2000 CKLE parameters, we numerically demonstrate that rPICKLE posteriors are highly informative: they provide mean estimates with an accuracy comparable to that of the MAP solution, and confidence intervals that mostly cover the reference solution. We are not able to obtain the HMC posterior to validate rPICKLE's convergence to the true posterior because of HMC's prohibitive computational cost for the considered high-dimensional problems. Our results demonstrate the advantages of rPICKLE over HMC for approximately sampling high-dimensional posterior distributions subject to physics constraints.
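The core idea of rPICKLE, repeatedly minimizing a randomly perturbed MAP loss to draw posterior samples, can be illustrated on a toy problem. The sketch below is not the paper's implementation: it replaces the PDE residual with a hypothetical linear operator `A` so that each perturbed minimization has a closed-form solution (in this linear-Gaussian setting, randomize-then-optimize is known to sample the posterior exactly; all names and scales are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the PICKLE loss: ||A @ xi - b||^2 / (2 sigma^2) + ||xi||^2 / 2,
# where xi plays the role of the CKLE coefficients, A @ xi - b the PDE residuals,
# and the l2 term the Gaussian prior / regularization.
n_params, n_obs = 5, 20
A = rng.normal(size=(n_obs, n_params))  # hypothetical linear "residual operator"
xi_true = rng.normal(size=n_params)
sigma = 0.1                             # residual (observation noise) scale
b = A @ xi_true + sigma * rng.normal(size=n_obs)

def rpickle_sample(n_samples):
    """Approximate posterior sampling by randomize-then-optimize:
    add zero-mean Gaussian perturbations to the data-misfit and
    regularization terms, then re-solve the resulting MAP problem."""
    H = A.T @ A / sigma**2 + np.eye(n_params)  # Hessian of the quadratic loss
    samples = []
    for _ in range(n_samples):
        b_pert = b + sigma * rng.normal(size=n_obs)  # perturb residual term
        eta = rng.normal(size=n_params)              # perturb prior/reg. term
        rhs = A.T @ b_pert / sigma**2 + eta
        samples.append(np.linalg.solve(H, rhs))      # perturbed MAP solution
    return np.array(samples)

samples = rpickle_sample(2000)
print(samples.mean(axis=0))  # sample mean tracks the (unperturbed) MAP solution
```

In the paper's setting each minimization is a nonlinear optimization over CKLE coefficients rather than a linear solve, but the structure is the same: one independent perturbed-loss minimization per posterior sample, which parallelizes trivially, in contrast to a sequential HMC chain.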
