
Sequential design for surrogate modeling in Bayesian inverse problems (2402.16520v3)

Published 26 Feb 2024 in stat.ME

Abstract: Sequential design is a highly active field of research in active learning that provides a general framework for designing computer experiments under limited computational budgets. It aims to create efficient surrogate models to replace complex computer codes. Some sequential design strategies can be understood within the Stepwise Uncertainty Reduction (SUR) framework, in which each new design point is chosen by minimizing the expectation of an uncertainty metric with respect to the yet unknown new data point. These methods offer an accessible framework for sequential experiment design, including almost sure convergence for common uncertainty functionals. This paper introduces two strategies. The first, entitled Constraint Set Query (CSQ), is adapted from D-optimal designs, with the search space constrained to a ball, in the Mahalanobis distance, around the maximum a posteriori estimate. The second, known as the IP-SUR (Inverse Problem SUR) strategy, uses a weighted integrated mean squared prediction error as the uncertainty metric and is derived from SUR methods. It is tractable for Gaussian process surrogates with continuous sample paths, and it comes with a theoretical guarantee of almost sure convergence of the uncertainty functional. The benefits of this work are highlighted in various test cases, in which these two strategies are compared to other sequential designs.
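The two acquisition ideas in the abstract can be illustrated with a toy sketch. This is not the paper's implementation: it uses a plain NumPy Gaussian-process surrogate with a fixed RBF kernel, and all names (`rbf`, `posterior_var`, `csq_filter`, `ipsur_next`), the candidate sets, and the quadrature weights are illustrative assumptions. It relies on the fact that, for a GP, the posterior variance depends only on the design points, not on the responses, so the effect of a candidate point can be scored before evaluating the expensive code.

```python
# Hedged sketch of the two strategies described in the abstract.
# Illustrative only; names and defaults are assumptions, not the paper's code.
import numpy as np

def rbf(A, B, ls=0.3):
    """Squared-exponential kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def posterior_var(X, Xcand, noise=1e-6):
    """GP posterior variance at candidate points given design X
    (response-independent, so it can be scored before evaluation)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Kc = rbf(Xcand, X)
    return 1.0 - np.einsum('ij,jk,ik->i', Kc, np.linalg.inv(K), Kc)

def csq_filter(cands, map_pt, cov, radius=2.0):
    """CSQ idea: restrict the search to a Mahalanobis ball around the MAP."""
    diff = cands - map_pt
    m2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
    return cands[m2 <= radius**2]

def ipsur_next(X, cands, quad_pts, weights, noise=1e-6):
    """IP-SUR idea: pick the candidate whose addition most reduces a
    posterior-weighted integrated mean squared prediction error,
    approximated here by a weighted sum over quadrature points."""
    best, best_val = None, np.inf
    for x in cands:
        Xp = np.vstack([X, x])
        crit = np.sum(weights * posterior_var(Xp, quad_pts, noise))
        if crit < best_val:
            best, best_val = x, crit
    return best
```

In this sketch the weights would come from the (approximate) posterior density of the inverse problem, so the integrated variance is driven down where the posterior concentrates, which is the motivation the abstract gives for weighting the prediction error.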
