Efficient Multi-Objective Neural Architecture Search via Pareto Dominance-based Novelty Search (2407.20656v1)

Published 30 Jul 2024 in cs.NE and cs.LG

Abstract: Neural Architecture Search (NAS) aims to automate the discovery of high-performing deep neural network architectures. Traditional objective-based NAS approaches typically optimize a certain performance metric (e.g., prediction accuracy), overlooking large parts of the architecture search space that potentially contain interesting network configurations. Furthermore, objective-driven population-based metaheuristics in complex search spaces often quickly exhaust population diversity and succumb to premature convergence to local optima. This issue becomes more complicated in NAS when performance objectives do not fully align with the actual performance of the candidate architectures, as is often the case with training-free metrics. While training-free metrics have gained popularity for their rapid performance estimation of candidate architectures without incurring computation-heavy network training, their effective incorporation into NAS remains a challenge. This paper presents the Pareto Dominance-based Novelty Search for multi-objective NAS with Multiple Training-Free metrics (MTF-PDNS). Unlike conventional NAS methods that optimize explicit objectives, MTF-PDNS promotes population diversity by utilizing a novelty score calculated based on multiple training-free performance and complexity metrics, thereby yielding a broader exploration of the search space. Experimental results on standard NAS benchmark suites demonstrate that MTF-PDNS outperforms conventional methods driven by explicit objectives in terms of convergence speed, diversity maintenance, architecture transferability, and computational costs.
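The core idea — rewarding candidates whose training-free metric profiles differ from those already seen, rather than rewarding the metric values themselves — can be sketched as follows. This is a generic k-nearest-neighbor novelty score over a vector of training-free metrics (e.g., a zero-cost proxy score and a complexity measure), plus a standard Pareto dominance check; the paper's exact Pareto dominance-based formulation of novelty may differ, and the metric names here are illustrative assumptions.

```python
import numpy as np

def dominates(a, b):
    """True if metric vector a Pareto-dominates b (all metrics assumed
    to be maximized): a is no worse in every metric and strictly better
    in at least one."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a >= b) and np.any(a > b))

def novelty_scores(metrics, k=3):
    """k-NN novelty over training-free metric vectors: the mean Euclidean
    distance from each candidate to its k nearest neighbors in metric
    space. High scores mark candidates unlike anything evaluated so far."""
    metrics = np.asarray(metrics, dtype=float)
    scores = np.empty(len(metrics))
    for i, v in enumerate(metrics):
        d = np.sort(np.linalg.norm(metrics - v, axis=1))
        scores[i] = d[1:k + 1].mean()  # d[0] is the self-distance (0)
    return scores

# Toy example: three clustered architectures and one outlier in a
# two-dimensional training-free metric space.
pop = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]]
print(novelty_scores(pop, k=2).argmax())                      # 3 (the outlier)
print(dominates([5.0, 5.0], [0.1, 0.0]))                      # True
```

Selecting parents by this novelty score (while keeping a Pareto-based elitist archive of the best architectures found) is what lets the search keep exploring even when the training-free metrics correlate imperfectly with true accuracy.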

Authors (2)
  1. An Vo
  2. Ngoc Hoang Luong