
Simulation Based Bayesian Optimization (2401.10811v2)

Published 19 Jan 2024 in stat.ML and cs.LG

Abstract: Bayesian Optimization (BO) is a powerful method for optimizing black-box functions by combining prior knowledge with ongoing function evaluations. BO constructs a probabilistic surrogate model of the objective function given the covariates, which is in turn used to inform the selection of future evaluation points through an acquisition function. For smooth continuous search spaces, Gaussian Processes (GPs) are commonly used as the surrogate model as they offer analytical access to posterior predictive distributions, thus facilitating the computation and optimization of acquisition functions. However, in complex scenarios involving optimization over categorical or mixed covariate spaces, GPs may not be ideal. This paper introduces Simulation Based Bayesian Optimization (SBBO) as a novel approach to optimizing acquisition functions that only requires sampling-based access to posterior predictive distributions. SBBO allows the use of surrogate probabilistic models tailored for combinatorial spaces with discrete variables. Any Bayesian model in which posterior inference is carried out through Markov chain Monte Carlo can be selected as the surrogate model in SBBO. We demonstrate empirically the effectiveness of SBBO using various choices of surrogate models in applications involving combinatorial optimization.
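The core idea in the abstract — optimizing an acquisition function using only sampling-based access to posterior predictive distributions — can be illustrated with a minimal sketch. This is not the paper's implementation: the surrogate below is a toy conjugate Bayesian linear model over a small binary search space, standing in for the MCMC-fitted surrogates SBBO actually admits, and all names and settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_predictive_samples(X_obs, y_obs, X_cand, n_draws=300):
    """Draw posterior samples of the latent function at candidate points
    from a conjugate Bayesian linear regression (unit prior precision,
    unit observation noise). Any sampler, e.g. MCMC, could stand in here."""
    d = X_obs.shape[1]
    prec = X_obs.T @ X_obs + np.eye(d)           # posterior precision
    cov = np.linalg.inv(prec)
    mean = cov @ X_obs.T @ y_obs                 # posterior mean weights
    W = rng.multivariate_normal(mean, cov, size=n_draws)   # (n_draws, d)
    return X_cand @ W.T                          # (n_cand, n_draws)

def mc_expected_improvement(samples, best_y):
    """Monte Carlo EI: computed from samples alone, with no need for an
    analytic posterior predictive distribution."""
    return np.maximum(samples - best_y, 0.0).mean(axis=1)

# Enumerate the full combinatorial space {0,1}^4 and a hidden objective.
cands = np.array([[(i >> b) & 1 for b in range(4)] for i in range(16)], float)
w_true = np.array([1.0, -2.0, 3.0, 0.5])
f = cands @ w_true                               # noise-free objective

observed = list(rng.choice(16, size=3, replace=False))
for _ in range(5):                               # BO loop
    X_obs, y_obs = cands[observed], f[observed]
    samples = posterior_predictive_samples(X_obs, y_obs, cands)
    ei = mc_expected_improvement(samples, y_obs.max())
    ei[observed] = -np.inf                       # do not re-evaluate points
    observed.append(int(np.argmax(ei)))

best_val = f[observed].max()
```

Because the acquisition step consumes only predictive draws, the linear surrogate could be swapped for any Bayesian model whose posterior is explored by Markov chain Monte Carlo, which is the flexibility SBBO targets for categorical and mixed spaces.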


Authors (2)