Model Uncertainty in Evolutionary Optimization and Bayesian Optimization: A Comparative Analysis (2403.14413v2)

Published 21 Mar 2024 in cs.NE and cs.LG

Abstract: Black-box optimization problems, which are common in many real-world applications, must be solved through input-output interactions alone, without access to internal workings; each evaluation often consumes significant computational resources, for example a full simulation run. Bayesian Optimization (BO) and Surrogate-Assisted Evolutionary Algorithms (SAEAs) are two widely used gradient-free techniques for addressing such challenges. Both follow a similar iterative procedure in which surrogate models guide the search. This paper elucidates the similarities and differences in how the two methods use model uncertainty, as well as the impact of model inaccuracies on algorithmic performance. A novel model-assisted strategy is introduced that uses unevaluated solutions to generate offspring, leveraging the population-based search capabilities of evolutionary algorithms to enhance the effectiveness of model-assisted optimization. Experimental results demonstrate that the proposed approach outperforms mainstream Bayesian optimization algorithms in terms of both accuracy and efficiency.
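
To make the idea concrete, the sketch below shows a minimal surrogate-assisted evolutionary loop in the spirit the abstract describes: a Gaussian process surrogate is fit to all truly evaluated points, offspring are generated from both evaluated parents and surrogate-screened but never-evaluated solutions, and the surrogate's predictive uncertainty ranks candidates via a lower-confidence-bound score (mu - sigma). This is an illustrative sketch only, not the paper's exact algorithm: the sphere objective, the blend-style variation operator, the LCB screening rule, and all names (`expensive_objective`, `pop_size`, etc.) are assumptions for demonstration, built on scikit-learn's GaussianProcessRegressor.

```python
# Illustrative sketch of a surrogate-assisted evolutionary loop that reuses
# unevaluated solutions as parents. NOT the paper's exact algorithm: the
# objective, operators, and hyperparameters below are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_objective(x):
    """Stand-in for an expensive black-box function (sphere, minimum at 0)."""
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim, pop_size, eval_budget = 5, 20, 100
lower, upper = -5.0, 5.0

# Initial population, evaluated with the true (expensive) objective.
pop = rng.uniform(lower, upper, size=(pop_size, dim))
fit = np.array([expensive_objective(x) for x in pop])
archive_x, archive_y = pop.copy(), fit.copy()   # all truly evaluated points
unevaluated = pop.copy()                        # screened but never evaluated
remaining = eval_budget - pop_size

while remaining > 0:
    # Fit a GP surrogate on every truly evaluated point so far.
    gp = GaussianProcessRegressor(normalize_y=True).fit(archive_x, archive_y)

    # Key idea from the abstract: offspring come from BOTH evaluated parents
    # and promising-but-unevaluated solutions carried over from screening.
    parents = np.vstack([pop, unevaluated])
    i = rng.integers(0, len(parents), size=pop_size)
    j = rng.integers(0, len(parents), size=pop_size)
    offspring = parents[i] + 0.5 * (parents[j] - parents[i])
    offspring += 0.1 * rng.normal(size=offspring.shape)   # small mutation
    offspring = np.clip(offspring, lower, upper)

    # Uncertainty-aware screening: rank by lower confidence bound (mu - sigma).
    mu, sigma = gp.predict(offspring, return_std=True)
    order = np.argsort(mu - sigma)                        # minimization

    # Spend one true evaluation on the top-ranked candidate ...
    best = offspring[order[0]]
    y = expensive_objective(best)
    remaining -= 1
    archive_x = np.vstack([archive_x, best])
    archive_y = np.append(archive_y, y)

    # ... and keep the runners-up as unevaluated parents for the next round.
    unevaluated = offspring[order[1:pop_size]]

    # Survival selection on the truly evaluated population.
    pop = np.vstack([pop, best])
    fit = np.append(fit, y)
    keep = np.argsort(fit)[:pop_size]
    pop, fit = pop[keep], fit[keep]

print(f"best objective found after {eval_budget} true evaluations: {fit.min():.4f}")
```

In standard BO, the same GP uncertainty would instead enter an acquisition function (for example, expected improvement) that is maximized over the whole search domain to choose the next query point; here the uncertainty only ranks an evolutionarily generated batch. That structural difference in how model uncertainty is consumed is the comparison the paper analyzes.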

Authors (3)
  1. Hao Hao
  2. Xiaoqun Zhang
  3. Aimin Zhou
Citations (3)
