Simulation Based Bayesian Optimization

(arXiv:2401.10811)
Published Jan 19, 2024 in stat.ML and cs.LG

Abstract

Bayesian Optimization (BO) is a powerful method for optimizing black-box functions by combining prior knowledge with ongoing function evaluations. BO constructs a probabilistic surrogate model of the objective function given the covariates, which is in turn used to inform the selection of future evaluation points through an acquisition function. For smooth continuous search spaces, Gaussian Processes (GPs) are commonly used as the surrogate model as they offer analytical access to posterior predictive distributions, thus facilitating the computation and optimization of acquisition functions. However, in complex scenarios involving optimization over categorical or mixed covariate spaces, GPs may not be ideal. This paper introduces Simulation Based Bayesian Optimization (SBBO) as a novel approach to optimizing acquisition functions that requires only sampling-based access to posterior predictive distributions. SBBO allows the use of surrogate probabilistic models tailored for combinatorial spaces with discrete variables. Any Bayesian model in which posterior inference is carried out through Markov chain Monte Carlo can be selected as the surrogate model in SBBO. In applications involving combinatorial optimization, we empirically demonstrate the effectiveness of SBBO using various choices of surrogate models.
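
The abstract's central point is that acquisition functions can be computed and optimized using nothing more than samples from the surrogate's posterior predictive distribution. As a minimal sketch of what that sampling-only interface looks like (this illustrates the general pattern, not the authors' SBBO algorithm; the function names `surrogate_draws` and `optimize_acquisition` are hypothetical), the snippet below estimates expected improvement by Monte Carlo from posterior predictive draws and searches a binary combinatorial space with a simple one-flip hill climb:

```python
import numpy as np

# Illustrative sketch only: a generic sampling-based acquisition
# optimization in the spirit of the abstract, NOT the authors' SBBO
# algorithm. All names here are hypothetical.

def expected_improvement(samples, incumbent):
    """Monte Carlo expected improvement from posterior predictive draws."""
    return np.mean(np.maximum(samples - incumbent, 0.0))

def optimize_acquisition(draw_posterior, incumbent, n_vars=10,
                         n_steps=500, seed=0):
    """Greedy one-flip search over {0,1}^n_vars maximizing sampled EI.

    draw_posterior(x) must return an array of posterior predictive draws
    of the objective at x -- the only access to the surrogate we assume.
    """
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, size=n_vars)        # random binary start
    best_x = x.copy()
    best_ei = expected_improvement(draw_posterior(x), incumbent)
    for _ in range(n_steps):
        proposal = x.copy()
        proposal[rng.integers(n_vars)] ^= 1    # flip one coordinate
        ei = expected_improvement(draw_posterior(proposal), incumbent)
        if ei > best_ei:                       # accept improving moves
            x, best_x, best_ei = proposal, proposal.copy(), ei
    return best_x, best_ei

if __name__ == "__main__":
    # Toy surrogate: noisy draws around a linear score, standing in for
    # MCMC posterior predictive samples from a real Bayesian model.
    rng = np.random.default_rng(1)
    w = rng.normal(size=10)
    surrogate_draws = lambda x: w @ x + rng.normal(scale=0.1, size=200)
    x_star, ei = optimize_acquisition(surrogate_draws, incumbent=0.5)
    print("candidate:", x_star, "estimated EI:", round(float(ei), 4))
```

The greedy flip search above is only a stand-in that makes the sampling-only interface concrete; per the abstract, any Bayesian model whose posterior inference runs through Markov chain Monte Carlo could supply the predictive draws in place of the toy surrogate.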
