
Colony-Enhanced Recurrent Neural Architecture Search: Collaborative Ant-Based Optimization (2401.17480v1)

Published 30 Jan 2024 in cs.NE

Abstract: Crafting neural network architectures manually is a formidable challenge that often leads to suboptimal and inefficient structures. The pursuit of the ideal neural configuration is complex, prompting the need for a metaheuristic approach such as Neural Architecture Search (NAS). Drawing inspiration from mechanisms found in nature, this paper introduces Collaborative Ant-based Neural Topology Search (CANTS-N), pushing the boundaries of NAS and Neural Evolution (NE). In this approach, ant-inspired agents construct neural network structures, dynamically adapting to a changing environment much like their natural counterparts. Guided by Particle Swarm Optimization (PSO), CANTS-N's colonies optimize the architecture search, achieving marked improvements in mean squared error (MSE) over established methods, including BP-free CANTS, BP CANTS, and ANTS. Scalable, adaptable, and forward-looking, CANTS-N has the potential to reshape the landscape of NAS and NE. This paper provides detailed insights into its methodology, results, and broader implications.
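The abstract describes a two-level search: an outer PSO loop steers colony-level behaviour, while ant-like agents inside each colony construct candidate network structures scored by MSE. The Python sketch below is a minimal illustration of that idea under stated assumptions, not the authors' implementation: the colony hyperparameters (number of ants, pheromone evaporation rate), their ranges, and the synthetic fitness stand-in for a full ant-based topology search are all hypothetical.

```python
import random

# Hypothetical stand-in for one colony's ant-based topology search.
# A real run would let `num_ants` agents construct a recurrent network,
# train it, and return validation MSE; here a synthetic score keeps the
# sketch runnable end to end.
def ant_colony_search_mse(num_ants, evaporation_rate):
    return ((num_ants - 20) ** 2 * 1e-4
            + (evaporation_rate - 0.3) ** 2
            + random.uniform(0.0, 0.01))

def pso_optimize_colonies(num_particles=8, iterations=30,
                          w=0.7, c1=1.5, c2=1.5, seed=42):
    """PSO over two assumed colony hyperparameters: (num_ants, evaporation_rate)."""
    random.seed(seed)
    bounds = [(5, 50), (0.05, 0.9)]  # search range per dimension

    # Initialise particle positions, velocities, and personal bests.
    positions = [[random.uniform(lo, hi) for lo, hi in bounds]
                 for _ in range(num_particles)]
    velocities = [[0.0, 0.0] for _ in range(num_particles)]
    pbest = [p[:] for p in positions]
    pbest_score = [ant_colony_search_mse(int(p[0]), p[1]) for p in positions]

    # Swarm-wide best position and score.
    gbest_score, gbest = min(zip(pbest_score, (p[:] for p in pbest)))

    for _ in range(iterations):
        for i, pos in enumerate(positions):
            for d, (lo, hi) in enumerate(bounds):
                r1, r2 = random.random(), random.random()
                velocities[i][d] = (w * velocities[i][d]
                                    + c1 * r1 * (pbest[i][d] - pos[d])
                                    + c2 * r2 * (gbest[d] - pos[d]))
                pos[d] = min(max(pos[d] + velocities[i][d], lo), hi)

            score = ant_colony_search_mse(int(pos[0]), pos[1])
            if score < pbest_score[i]:
                pbest[i], pbest_score[i] = pos[:], score
                if score < gbest_score:
                    gbest, gbest_score = pos[:], score

    return gbest, gbest_score

if __name__ == "__main__":
    best, mse = pso_optimize_colonies()
    print(f"best colony settings: ants={int(best[0])}, "
          f"evaporation={best[1]:.2f}, synthetic MSE={mse:.4f}")
```

In the paper's actual setting the synthetic fitness would be replaced by a full CANTS-style run that builds, trains, and evaluates a recurrent network; the sketch only shows how an outer PSO loop can steer such colony-level parameters toward lower MSE.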

