Comparison of High-Dimensional Bayesian Optimization Algorithms on BBOB (2303.00890v3)
Abstract: Bayesian Optimization (BO) is a class of black-box, surrogate-based heuristics that can efficiently optimize problems that are expensive to evaluate, and hence admit only small evaluation budgets. BO is particularly popular for solving numerical optimization problems in industry, where the evaluation of objective functions often relies on time-consuming simulations or physical experiments. However, many industrial problems depend on a large number of parameters. This poses a challenge for BO algorithms, whose performance is often reported to suffer when the dimension grows beyond 15 variables. Although many new algorithms have been proposed to address this problem, it is not well understood which one is the best for which optimization scenario. In this work, we compare five state-of-the-art high-dimensional BO algorithms, as well as vanilla BO and CMA-ES, on the 24 BBOB functions of the COCO environment at increasing dimensionality, ranging from 10 to 60 variables. Our results confirm the superiority of BO over CMA-ES for limited evaluation budgets and suggest that the most promising approach to improve BO is the use of trust regions. However, we also observe significant performance differences for different function landscapes and budget exploitation phases, indicating improvement potential, e.g., through hybridization of algorithmic components.
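To make the compared setup concrete, the sketch below shows a minimal vanilla BO loop (Gaussian-process surrogate with an expected-improvement acquisition) on a simple stand-in objective over the BBOB search domain [-5, 5]^d. This is an illustrative assumption, not the paper's implementation: the surrogate settings, the random-candidate acquisition optimizer, the sphere objective, and the budget of 100 evaluations are all simplifications chosen for brevity.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def sphere(x):
    # Stand-in for a BBOB objective; the study evaluates all 24 BBOB functions.
    return float(np.sum(x ** 2))

dim, budget, n_init = 10, 100, 10   # e.g. 10 variables with a small evaluation budget
lb, ub = -5.0, 5.0                  # BBOB search domain [-5, 5]^d

# Initial design: uniform random samples.
X = rng.uniform(lb, ub, size=(n_init, dim))
y = np.array([sphere(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(budget - n_init):
    gp.fit(X, y)
    # Maximize expected improvement over random candidates (kept simple here).
    cand = rng.uniform(lb, ub, size=(2048, dim))
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, sphere(x_next))

print("best value found:", y.min())
```

A CMA-ES baseline can be run on the same objective with pycma, e.g. `cma.fmin(sphere, [0.0] * dim, 2.0)`; trust-region BO variants such as TuRBO differ from the loop above mainly by restricting candidate sampling to a box around the incumbent that expands or shrinks with success or failure.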
Authors: Maria Laura Santoni, Elena Raponi, Renato De Leone, Carola Doerr