A First Step Towards Runtime Analysis of Evolutionary Neural Architecture Search (2401.11712v2)

Published 22 Jan 2024 in cs.NE

Abstract: Evolutionary neural architecture search (ENAS) employs evolutionary algorithms to find high-performing neural architectures automatically and has achieved great empirical success. In contrast to this empirical success, however, a rigorous theoretical analysis of ENAS has yet to be attempted. This work takes preliminary steps toward the mathematical runtime analysis of ENAS. In particular, we define a binary classification problem $\textsc{UNIFORM}$ and formulate an explicit fitness function that captures the relationship between a neural architecture and its classification accuracy. Furthermore, we consider a (1+1)-ENAS algorithm with mutation to optimize the neural architecture, and obtain the following runtime bounds: both the local and the global mutation operator find the optimum in an expected runtime of $\Theta(n)$, where $n$ is the problem size. These theoretical results show that local and global mutation achieve nearly the same performance on $\textsc{UNIFORM}$. Empirical results further verify the equivalence of the two mutation operators.
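The abstract contrasts a local mutation operator (change a single component) with a global one (change each component independently with probability $1/n$) inside a (1+1) elitist loop. The sketch below is illustrative only: the paper's $\textsc{UNIFORM}$ fitness and its architecture encoding are not reproduced here, so a plain bit-string individual with OneMax (the number of ones) as a stand-in objective is assumed.

```python
import random

def local_mutation(x):
    """Local mutation: flip exactly one uniformly chosen bit."""
    y = x[:]
    y[random.randrange(len(y))] ^= 1
    return y

def global_mutation(x):
    """Global (standard bit) mutation: flip each bit independently with probability 1/n."""
    n = len(x)
    return [b ^ (random.random() < 1.0 / n) for b in x]

def one_plus_one(n, mutate, fitness):
    """(1+1) elitist loop: replace the parent iff the offspring is at least as fit.
    Returns the number of fitness evaluations until the optimum (value n) is found.
    Note: the individual is a bit string and the optimum value n are assumptions
    made for this placeholder objective, not the paper's UNIFORM setup."""
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 1
    while fitness(x) < n:
        y = mutate(x)
        evals += 1
        if fitness(y) >= fitness(x):
            x = y
    return evals

if __name__ == "__main__":
    random.seed(0)
    n, trials = 64, 20
    for name, op in [("local", local_mutation), ("global", global_mutation)]:
        runs = [one_plus_one(n, op, sum) for _ in range(trials)]  # sum = OneMax
        print(f"{name:6s} mutation: mean evaluations = {sum(runs) / trials:.1f}")
```

Note that on OneMax both operators need $\Theta(n \log n)$ evaluations in expectation, whereas the paper proves $\Theta(n)$ for $\textsc{UNIFORM}$; the placeholder only illustrates the mechanics of the two operators within the (1+1) scheme, not the paper's bound.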

