Randomized Maximum Likelihood via High-Dimensional Bayesian Optimization (2204.08022v3)
Published 17 Apr 2022 in stat.CO
Abstract: Posterior sampling for high-dimensional Bayesian inverse problems is a common challenge in real-world applications. Randomized Maximum Likelihood (RML) is an optimization-based methodology that produces samples from an approximation to the posterior distribution. We develop a high-dimensional Bayesian Optimization (BO) approach based on Gaussian Process (GP) surrogate models to solve the RML problem. We demonstrate the benefits of our approach in comparison to alternative optimization methods on a variety of synthetic and real-world Bayesian inverse problems, including medical and magnetohydrodynamics applications.
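As a rough illustration of the RML idea (not the paper's algorithm, which uses high-dimensional BO with GP surrogates as the optimizer), the following sketch runs RML on a hypothetical linear-Gaussian toy problem. Each sample perturbs the prior draw and the observations, then minimizes a regularized misfit; for a linear forward map this minimizer has a closed form, and the resulting samples coincide with the exact posterior. All names, dimensions, and the forward operator below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear inverse problem: y = G x + noise
d, m = 2, 3                      # parameter and data dimensions
G = rng.standard_normal((m, d))  # forward operator
C = np.eye(d)                    # prior covariance (prior mean 0)
R = 0.1 * np.eye(m)              # observation-noise covariance
x_true = rng.standard_normal(d)
y_obs = G @ x_true + rng.multivariate_normal(np.zeros(m), R)

Rinv, Cinv = np.linalg.inv(R), np.linalg.inv(C)

def rml_sample():
    # Perturb the prior draw and the data, then minimize the RML objective
    #   ||y' - G x||^2_{R^{-1}} + ||x - x'||^2_{C^{-1}}.
    x_prior = rng.multivariate_normal(np.zeros(d), C)
    y_pert = rng.multivariate_normal(y_obs, R)
    # Linear case: the minimizer is the solution of a normal-equations system.
    # In general (nonlinear G) this step is a numerical optimization, which is
    # where the paper's surrogate-based BO approach comes in.
    A = G.T @ Rinv @ G + Cinv
    b = G.T @ Rinv @ y_pert + Cinv @ x_prior
    return np.linalg.solve(A, b)

samples = np.array([rml_sample() for _ in range(5000)])

# Analytic Gaussian posterior for comparison: in the linear-Gaussian setting
# the RML sample mean/covariance should match these.
post_cov = np.linalg.inv(G.T @ Rinv @ G + Cinv)
post_mean = post_cov @ G.T @ Rinv @ y_obs
```

The expensive part in realistic problems is the repeated optimization over a costly forward model `G`, which motivates replacing it with a GP surrogate inside a Bayesian Optimization loop.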