Simulation Based Bayesian Optimization (2401.10811v2)
Abstract: Bayesian Optimization (BO) is a powerful method for optimizing black-box functions by combining prior knowledge with ongoing function evaluations. BO constructs a probabilistic surrogate model of the objective function given the covariates, which is in turn used to inform the selection of future evaluation points through an acquisition function. For smooth continuous search spaces, Gaussian Processes (GPs) are commonly used as the surrogate model because they offer analytical access to posterior predictive distributions, facilitating the computation and optimization of acquisition functions. However, in complex scenarios involving optimization over categorical or mixed covariate spaces, GPs may not be ideal. This paper introduces Simulation Based Bayesian Optimization (SBBO), a novel approach to optimizing acquisition functions that requires only sampling-based access to posterior predictive distributions. SBBO allows the use of surrogate probabilistic models tailored to combinatorial spaces with discrete variables; any Bayesian model in which posterior inference is carried out via Markov chain Monte Carlo can serve as the surrogate. We empirically demonstrate the effectiveness of SBBO with various choices of surrogate model in applications involving combinatorial optimization.
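As a rough illustration of the sampling-based idea described in the abstract (not the paper's actual SBBO algorithm), the sketch below scores candidate points with a Monte Carlo estimate of expected improvement computed purely from posterior predictive draws, e.g. draws produced by an MCMC-fitted surrogate. The function names, the toy predictive sampler, and the simple enumeration over binary candidates are all hypothetical placeholders.

```python
import numpy as np

def mc_expected_improvement(posterior_samples, best_observed):
    """Monte Carlo estimate of expected improvement (maximization)
    from posterior predictive draws of the objective at one candidate."""
    improvement = np.maximum(posterior_samples - best_observed, 0.0)
    return improvement.mean()

def propose_next_point(draw_predictive, candidates, best_observed, n_draws=500):
    """Score each candidate with the Monte Carlo acquisition and return the best.

    draw_predictive(x, n) must return n posterior predictive samples of the
    objective at x -- for instance, draws from an MCMC-based surrogate model.
    """
    scores = [
        mc_expected_improvement(draw_predictive(x, n_draws), best_observed)
        for x in candidates
    ]
    return candidates[int(np.argmax(scores))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for a surrogate over binary vectors: predictive samples
    # are normal around the (hidden) number of ones in x.
    draw_predictive = lambda x, n: rng.normal(loc=x.sum(), scale=1.0, size=n)
    candidates = [rng.integers(0, 2, size=10) for _ in range(50)]
    x_next = propose_next_point(draw_predictive, candidates, best_observed=6.0)
    print("next evaluation point:", x_next)
```

The point of the sketch is that the acquisition value never requires a closed-form posterior: only the ability to draw predictive samples at a candidate, which is exactly what MCMC-based surrogates provide.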
- Roi Naveiro
- Becky Tang