
NeuroLGP-SM: A Surrogate-assisted Neuroevolution Approach using Linear Genetic Programming (2403.19459v1)

Published 28 Mar 2024 in cs.NE and cs.AI

Abstract: Evolutionary algorithms are increasingly recognised as a viable computational approach for the automated optimisation of deep neural networks (DNNs) within artificial intelligence. This method extends to the training of DNNs, an approach known as neuroevolution. However, neuroevolution is an inherently resource-intensive process, with some studies reporting thousands of GPU days to refine and train a single DNN. To address the computational challenges associated with neuroevolution while still attaining good DNN accuracy, surrogate models emerge as a pragmatic solution. Despite their potential, the integration of surrogate models into neuroevolution is still in its early stages, hindered by factors such as the effective use of high-dimensional data and the representation employed in neuroevolution. In this context, we address these challenges by employing a suitable representation based on Linear Genetic Programming, denoted NeuroLGP, and leveraging Kriging Partial Least Squares. Combining these two techniques yields our proposed methodology, the NeuroLGP-Surrogate Model (NeuroLGP-SM). For comparison, we also implement a baseline approach incorporating a repair mechanism, a common practice in neuroevolution. Notably, the baseline approach surpasses the renowned VGG-16 model in accuracy. Given the computational intensity of DNN operations, a single run is typically the norm; to evaluate the efficacy of our proposed approach, we nevertheless conducted 96 independent runs. Our methodologies consistently outperform the baseline, with the SM model demonstrating accuracy superior or comparable to the NeuroLGP approach. The SM approach offers the additional advantage of a 25% reduction in computational requirements, further emphasising its efficiency for neuroevolution.
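The efficiency claim rests on a standard surrogate-assisted loop: train a cheap predictor on the validation fitness of networks that have already been fully trained, then use it to pre-screen offspring so that only promising candidates receive expensive training. The sketch below is a minimal illustration of that idea, not the authors' implementation: it approximates Kriging Partial Least Squares (KPLS) by chaining scikit-learn's PLS projection with a Gaussian-process (Kriging) regressor, and `evaluate_expensive`, the descriptor dimension, and the population sizes are all hypothetical placeholders.

```python
# Minimal sketch of surrogate-assisted fitness pre-screening (illustrative only).
# True KPLS scales kernel length-scales by PLS weights; here we approximate it by
# projecting to PLS latent scores and fitting a Gaussian process (Kriging) on them.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def evaluate_expensive(x):
    # Hypothetical stand-in for fully training a DNN and returning validation accuracy.
    return float(np.tanh(x[:5].sum()) + 0.05 * rng.normal())

# High-dimensional descriptors of networks that have already been fully evaluated.
X_seen = rng.normal(size=(40, 200))          # 40 evaluated nets, 200-dim descriptors
y_seen = np.array([evaluate_expensive(x) for x in X_seen])

# Reduce to a few latent components, then fit Kriging in the reduced space.
pls = PLSRegression(n_components=4).fit(X_seen, y_seen)
Z_seen = pls.transform(X_seen)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(Z_seen, y_seen)

# Offspring: score cheaply with the surrogate, fully train only the best few.
X_offspring = rng.normal(size=(100, 200))
mu, sigma = gp.predict(pls.transform(X_offspring), return_std=True)
top = np.argsort(-mu)[:10]                   # sigma could instead drive an infill
y_true = [evaluate_expensive(X_offspring[i]) for i in top]  # criterion such as EI
print("surrogate picks:", top, "true fitness of picks:", np.round(y_true, 3))
```

Pre-screening of this kind is the plausible source of the savings the abstract reports: in the SM variant, most offspring are scored by the surrogate alone and never incur the full training cost.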
