Automated Dominative Subspace Mining for Efficient Neural Architecture Search (2210.17180v2)

Published 31 Oct 2022 in cs.CV

Abstract: Neural Architecture Search (NAS) aims to automatically find effective architectures within a predefined search space. However, the search space is often extremely large. As a result, directly searching in such a large search space is non-trivial and also very time-consuming. To address the above issues, in each search step, we seek to limit the search space to a small but effective subspace to boost both the search performance and search efficiency. To this end, we propose a novel Neural Architecture Search method via Dominative Subspace Mining (DSM-NAS) that finds promising architectures in automatically mined subspaces. Specifically, we first perform a global search, i.e., dominative subspace mining, to find a good subspace from a set of candidates. Then, we perform a local search within the mined subspace to find effective architectures. More critically, we further boost search performance by taking well-designed/searched architectures to initialize candidate subspaces. Experimental results demonstrate that DSM-NAS not only reduces the search cost but also discovers better architectures than state-of-the-art methods in various benchmark search spaces.
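The abstract's two-stage loop (a global search that mines a promising subspace, followed by a local search inside it) can be sketched as follows. This is a minimal illustrative simplification, not the paper's actual algorithm: the subspace representation, the scoring rule used to pick the "dominative" subspace, and the random local sampling are all assumptions made for the sketch.

```python
import random

def dsm_nas_sketch(candidate_subspaces, evaluate, n_steps=10, n_local=5):
    """Toy sketch of a DSM-NAS-style search loop (hypothetical simplification).

    candidate_subspaces: list of dicts, each with
        "space": list of per-position option lists defining the subspace,
        "score": best score observed in this subspace so far.
    evaluate: callable mapping an architecture (tuple of choices) to a score.
    """
    best_arch, best_score = None, float("-inf")
    for _ in range(n_steps):
        # Global search: pick the subspace with the best score seen so far
        # (a stand-in for the paper's dominative subspace mining).
        subspace = max(candidate_subspaces, key=lambda s: s["score"])
        # Local search: sample architectures within the mined subspace.
        for _ in range(n_local):
            arch = tuple(random.choice(opts) for opts in subspace["space"])
            score = evaluate(arch)
            subspace["score"] = max(subspace["score"], score)
            if score > best_score:
                best_arch, best_score = arch, score
    return best_arch, best_score
```

In the paper, candidate subspaces can additionally be initialized from well-designed or previously searched architectures; in this sketch that would correspond to seeding `candidate_subspaces` with option lists centered on such architectures.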

Authors (7)
  1. Yaofo Chen (14 papers)
  2. Yong Guo (67 papers)
  3. Daihai Liao (2 papers)
  4. Fanbing Lv (3 papers)
  5. Hengjie Song (3 papers)
  6. Mingkui Tan (124 papers)
  7. James Tin-Yau Kwok (14 papers)
Citations (2)