HiveNAS: Neural Architecture Search using Artificial Bee Colony Optimization (2211.10250v2)

Published 18 Nov 2022 in cs.NE, cs.AI, and cs.LG

Abstract: The traditional Neural Network development process requires substantial expert knowledge and relies heavily on intuition and trial-and-error. Neural Architecture Search (NAS) frameworks were introduced to robustly search for network topologies and to facilitate the automated development of Neural Networks. While some optimization approaches, such as Genetic Algorithms, have been extensively explored in the NAS context, other Metaheuristic Optimization algorithms have not yet been investigated. In this study, we evaluate the viability of Artificial Bee Colony optimization for Neural Architecture Search. Our proposed framework, HiveNAS, outperforms existing state-of-the-art Swarm Intelligence-based NAS frameworks in a fraction of the time.
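The abstract's central idea, driving architecture search with the Artificial Bee Colony (ABC) metaheuristic, can be illustrated with a short sketch. The following is a generic ABC loop over toy architecture encodings, not HiveNAS's actual implementation: the encoding, the neighbor move, and the fitness proxy (random_architecture, neighbor, fitness) are hypothetical stand-ins, and a real NAS framework would train and validate each candidate network instead of scoring it analytically.

```python
import random

# Hypothetical encoding: a candidate "architecture" is a list of layer
# widths. This is a toy stand-in; HiveNAS's real encoding, operators, and
# fitness evaluation (training a network) are not reproduced here.
LAYER_CHOICES = [16, 32, 64, 128, 256]
NUM_LAYERS = 4

def random_architecture():
    return [random.choice(LAYER_CHOICES) for _ in range(NUM_LAYERS)]

def neighbor(arch):
    # ABC's local move: perturb one dimension of an existing food source.
    new = list(arch)
    i = random.randrange(len(new))
    new[i] = random.choice(LAYER_CHOICES)
    return new

def fitness(arch):
    # Toy proxy for validation accuracy; always positive so it can be
    # used as a selection weight in the onlooker phase.
    return sum(arch) / (1 + abs(len(set(arch)) - 2))

def try_improve(i, sources, scores, trials):
    # Greedy acceptance shared by the employed- and onlooker-bee phases.
    cand = neighbor(sources[i])
    cand_score = fitness(cand)
    if cand_score > scores[i]:
        sources[i], scores[i], trials[i] = cand, cand_score, 0
    else:
        trials[i] += 1

def abc_search(num_sources=10, limit=5, iterations=50):
    sources = [random_architecture() for _ in range(num_sources)]
    scores = [fitness(s) for s in sources]
    trials = [0] * num_sources

    for _ in range(iterations):
        # Employed-bee phase: local search around every food source.
        for i in range(num_sources):
            try_improve(i, sources, scores, trials)

        # Onlooker-bee phase: revisit sources with probability
        # proportional to their fitness.
        for _ in range(num_sources):
            i = random.choices(range(num_sources), weights=scores)[0]
            try_improve(i, sources, scores, trials)

        # Scout-bee phase: abandon sources that stagnated past the limit.
        for i in range(num_sources):
            if trials[i] > limit:
                sources[i] = random_architecture()
                scores[i] = fitness(sources[i])
                trials[i] = 0

    best = max(range(num_sources), key=scores.__getitem__)
    return sources[best], scores[best]

if __name__ == "__main__":
    arch, score = abc_search()
    print("best candidate:", arch, "proxy fitness:", round(score, 2))
```

In the standard ABC formulation, employed bees refine every current candidate, onlooker bees re-sample promising candidates in proportion to fitness, and scout bees replace candidates that have stagnated past a trial limit; this division is what balances exploitation against exploration in the search.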
