
Adaptive Gradient Enhanced Gaussian Process Surrogates for Inverse Problems (2404.01864v1)

Published 2 Apr 2024 in math.NA and cs.NA

Abstract: Generating simulated training data needed for constructing sufficiently accurate surrogate models to be used for efficient optimization or parameter identification can incur a huge computational effort in the offline phase. We consider a fully adaptive greedy approach to the computational design of experiments problem using gradient-enhanced Gaussian process regression as surrogates. Designs are incrementally defined by solving an optimization problem for accuracy given a certain computational budget. We address not only the choice of evaluation points but also of required simulation accuracy, both of values and gradients of the forward model. Numerical results show a significant reduction of the computational effort compared to just position-adaptive and static designs as well as a clear benefit of including gradient information into the surrogate training.


