
A First Look at Kolmogorov-Arnold Networks in Surrogate-assisted Evolutionary Algorithms (2405.16494v1)

Published 26 May 2024 in cs.NE

Abstract: Surrogate-assisted Evolutionary Algorithms (SAEAs) are an essential method for solving expensive optimization problems. Utilizing surrogate models to substitute the optimization function can significantly reduce reliance on function evaluations during the search process, thereby lowering optimization costs. The construction of surrogate models is a critical component in SAEAs, with numerous machine learning algorithms playing a pivotal role in the model-building phase. This paper introduces Kolmogorov-Arnold Networks (KANs) as surrogate models within SAEAs, examining their application and effectiveness. We employ KANs for regression and classification tasks, focusing on the selection of promising solutions during the search process, which consequently reduces the number of expensive function evaluations. Experimental results indicate that KANs demonstrate commendable performance within SAEAs, effectively decreasing the number of function calls and enhancing optimization efficiency. The relevant code is publicly accessible and can be found in the GitHub repository.
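The preselection idea described in the abstract can be sketched as follows: keep an archive of truly evaluated solutions, train a cheap surrogate on it, generate several candidate offspring per generation, and spend the expensive evaluation only on the candidate the surrogate ranks best. This is a minimal illustrative sketch, not the paper's implementation: a distance-weighted k-nearest-neighbour regressor stands in for the KAN surrogate, a sphere function stands in for the expensive objective, and all function names and parameters are assumptions for illustration.

```python
import numpy as np

def sphere(x):
    # cheap stand-in for the expensive black-box objective
    return float(np.sum(x ** 2))

def knn_surrogate(archive_X, archive_y, x, k=3):
    # distance-weighted k-NN regression: an illustrative stand-in
    # for the KAN regression surrogate used in the paper
    d = np.linalg.norm(archive_X - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-12)
    return float(np.sum(w * archive_y[idx]) / np.sum(w))

def saea(dim=5, pop=10, gens=30, cands=5, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (pop, dim))
    y = np.array([sphere(x) for x in X])   # initial true evaluations
    evals = pop
    for _ in range(gens):
        parent = X[np.argmin(y)]
        # generate several candidate offspring around the current best,
        # but truly evaluate only the surrogate-preferred one
        C = parent + rng.normal(0.0, 0.5, (cands, dim))
        preds = [knn_surrogate(X, y, c) for c in C]
        best = C[int(np.argmin(preds))]    # surrogate-based preselection
        fy = sphere(best)                  # single expensive evaluation
        evals += 1
        X = np.vstack([X, best])
        y = np.append(y, fy)
    return float(np.min(y)), evals
```

With `cands=5` candidates per generation, the surrogate filters out four of every five offspring, so the true-evaluation budget stays at `pop + gens` rather than `pop + gens * cands`; this cost reduction is the central motivation for SAEAs.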

Authors (4)
  1. Hao Hao
  2. Xiaoqun Zhang
  3. Bingdong Li
  4. Aimin Zhou