
Neural Architecture Search: Insights from 1000 Papers (2301.08727v2)

Published 20 Jan 2023 in cs.LG, cs.AI, and stat.ML

Abstract: In the past decade, advances in deep learning have resulted in breakthroughs in a variety of areas, including computer vision, natural language understanding, speech recognition, and reinforcement learning. Specialized, high-performing neural architectures are crucial to the success of deep learning in these areas. Neural architecture search (NAS), the process of automating the design of neural architectures for a given task, is an inevitable next step in automating machine learning and has already outpaced the best human-designed architectures on many tasks. In the past few years, research in NAS has been progressing rapidly, with over 1000 papers released since 2020 (Deng and Lindauer, 2021). In this survey, we provide an organized and comprehensive guide to neural architecture search. We give a taxonomy of search spaces, algorithms, and speedup techniques, and we discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
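To make the idea of neural architecture search concrete, here is a minimal illustrative sketch of the simplest NAS algorithm, random search over a small hypothetical search space. All names, the search space, and the scoring function are invented for illustration; real NAS trains and evaluates each candidate network, which a stand-in proxy score replaces here.

```python
import random

# Hypothetical toy search space: each architecture is a choice of
# depth, width, and a single operation type.
SEARCH_SPACE = {
    "depth": [2, 4, 6, 8],
    "width": [32, 64, 128],
    "op": ["conv3x3", "conv5x5", "sep_conv", "skip"],
}

def sample_architecture(rng):
    """Draw one candidate architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Stand-in for validation accuracy after training (invented numbers)."""
    base = {"conv3x3": 0.70, "conv5x5": 0.68,
            "sep_conv": 0.72, "skip": 0.60}[arch["op"]]
    return base + 0.01 * arch["depth"] + 0.0001 * arch["width"]

def random_search(n_trials=50, seed=0):
    """Random search: sample candidates independently, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 4))
```

Random search is a common NAS baseline; the search spaces, algorithms (e.g., evolutionary and gradient-based methods), and speedup techniques the survey catalogs replace the sampling loop and the proxy evaluation with more sophisticated components.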

Authors (8)
  1. Colin White (34 papers)
  2. Mahmoud Safari (24 papers)
  3. Rhea Sukthanker (3 papers)
  4. Binxin Ru (24 papers)
  5. Thomas Elsken (11 papers)
  6. Arber Zela (22 papers)
  7. Debadeepta Dey (32 papers)
  8. Frank Hutter (177 papers)
Citations (96)
