Practical Path-based Bayesian Optimization (2312.00622v1)

Published 1 Dec 2023 in cs.LG, math.OC, and stat.ME

Abstract: There has been a surge in interest in data-driven experimental design with applications to chemical engineering and drug manufacturing. Bayesian optimization (BO) has proven to be adaptable to such cases, since we can model the reactions of interest as expensive black-box functions. Sometimes, the cost of these black-box functions can be separated into two parts: (a) the cost of the experiment itself, and (b) the cost of changing the input parameters. In this short paper, we extend the SnAKe algorithm to deal with both types of costs simultaneously. We further propose extensions to the case of a maximum allowable input change, as well as to the multi-objective setting.
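To make the cost structure concrete, the sketch below (not taken from the paper) penalizes a standard expected-improvement acquisition by the cost of moving the input, and restricts candidates to a maximum allowable input change around the current setting. The names `input_cost_weight` and `max_input_change`, the toy objective, and the use of scikit-learn's Gaussian process are illustrative assumptions, not the paper's SnAKe-based method.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def objective(x):
    # Toy 1-D "experiment": an expensive black-box function to minimize.
    return np.sin(3.0 * x) + 0.3 * x**2

# A few initial experiments.
X = rng.uniform(-2.0, 2.0, size=(4, 1))
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                              alpha=1e-6, normalize_y=True)

def expected_improvement(mu, sigma, best):
    # Standard EI for minimization.
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

input_cost_weight = 0.2  # hypothetical weight on the cost of changing the input
max_input_change = 0.5   # hypothetical cap on |x_next - x_current|

x_current = X[-1]
for _ in range(10):
    gp.fit(X, y)
    # Candidates restricted to the allowed window around the current input setting.
    candidates = np.linspace(x_current - max_input_change,
                             x_current + max_input_change, 201).reshape(-1, 1)
    mu, sigma = gp.predict(candidates, return_std=True)
    ei = expected_improvement(mu, sigma, y.min())
    # Penalize the acquisition by the cost of moving the input parameters.
    movement_cost = np.abs(candidates - x_current).ravel()
    scores = ei - input_cost_weight * movement_cost
    x_next = candidates[np.argmax(scores)]
    y_next = objective(x_next)
    X = np.vstack([X, x_next.reshape(1, -1)])
    y = np.append(y, y_next)
    x_current = x_next

print("best value found:", y.min(), "at x =", X[np.argmin(y)])
```

Raising `input_cost_weight` makes the search favour small input moves (shorter paths), while shrinking `max_input_change` emulates a hard constraint on how far the input may change between consecutive experiments.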

References (29)
  1. BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization. In Advances in Neural Information Processing Systems 33, 2020.
  2. Truncated variance reduction: A unified approach to Bayesian optimization and level-set estimation. Advances in Neural Information Processing Systems, 29, 2016.
  3. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182–197, 2002.
  4. Summit: Benchmarking Machine Learning Methods for Reaction Optimisation. Chemistry Methods, February 2021.
  5. SnAKe: Bayesian Optimization with Pathwise Exploration. In Advances in Neural Information Processing Systems, volume 35, pages 35226–35239, 2022.
  6. GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration. Advances in Neural Information Processing Systems, pages 7576–7586, 2018.
  7. Constrained Bayesian optimization for automatic chemical design using variational autoencoders. Chemical Science, 11(2):577–586, 2020.
  8. Exploring network structure, dynamics, and function using NetworkX. Technical report, Los Alamos National Lab. (LANL), 2008.
  9. Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13(4):455–492, 1998.
  10. Parallelised Bayesian Optimisation via Thompson Sampling. In Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, pages 133–142, 2018.
  11. Adam: A Method for Stochastic Optimization. International Conference on Learning Representations, December 2014.
  12. Optimization by simulated annealing. Science, 220(4598):671–680, 1983.
  13. Active Exploration via Experiment Design in Markov Chains. In Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, volume 206, pages 7349–7374, 2023.
  14. Sequential approximate multiobjective optimization using computational intelligence. Springer Science & Business Media, 2009.
  15. Regret for Expected Improvement over the Best-Observed Value and Stopping Condition. In Asian Conference on Machine Learning, pages 279–294, 2017.
  16. A flexible framework for multi-objective Bayesian optimization using random scalarizations. In Uncertainty in Artificial Intelligence, pages 766–776. PMLR, 2020.
  17. PyTorch: An Imperative Style, High-Performance Deep Learning Library. In Advances in Neural Information Processing Systems, pages 8024–8035. Curran Associates, Inc., 2019.
  18. Efficient Multi-Step Lookahead Bayesian Optimization with Local Search Constraints. In 2022 IEEE 61st Conference on Decision and Control (CDC), pages 123–129, 2022.
  19. LSR-BO: Local search region constrained Bayesian optimization for performance optimization of vapor compression systems. In 2023 American Control Conference (ACC), pages 576–582. IEEE, 2023.
  20. Movement Penalized Bayesian Optimization with Application to Wind Energy Systems. In Advances in Neural Information Processing Systems, volume 35, pages 27036–27048, 2022.
  21. Carl Edward Rasmussen and Christopher K. I. Williams. Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning). The MIT Press, 2005.
  22. A Bayesian Optimization Approach for Water Resources Monitoring Through An Autonomous Surface Vehicle: The Ypacarai Lake Case Study. IEEE Access, 9:9163–9179, 2021.
  23. Taking the Human Out of the Loop: A Review of Bayesian Optimization. Proceedings of the IEEE, 104(1):148–175, 2016.
  24. Practical Bayesian optimization of machine learning algorithms. Advances in Neural Information Processing Systems, 25, 2012.
  25. Multi-objective constrained optimization for energy applications via tree ensembles. Applied Energy, 306:118061, 2022.
  26. Joint entropy search for multi-objective Bayesian optimization. In Advances in Neural Information Processing Systems, volume 35, pages 9922–9938, 2022.
  27. Process-constrained Batch Bayesian Optimisation. In Advances in Neural Information Processing Systems. Curran Associates, Inc., 2017.
  28. Efficiently Sampling Functions from Gaussian Process Posteriors. In International Conference on Machine Learning, pages 10292–10302, 2020.
  29. MONGOOSE: Path-wise Smooth Bayesian Optimisation via Meta-learning. arXiv preprint arXiv:2302.11533, 2023.