
Bonsai-Net: One-Shot Neural Architecture Search via Differentiable Pruners (2006.09264v3)

Published 12 Jun 2020 in cs.LG and stat.ML

Abstract: One-shot Neural Architecture Search (NAS) aims to minimize the computational expense of discovering state-of-the-art models. However, in the past year attention has been drawn to the comparable performance of naive random search across the same search spaces used by leading NAS algorithms. To address this, we explore the effects of drastically relaxing the NAS search space, and we present Bonsai-Net, an efficient one-shot NAS method to explore our relaxed search space. Bonsai-Net is built around a modified differential pruner and can consistently discover state-of-the-art architectures that are significantly better than random search with fewer parameters than other state-of-the-art methods. Additionally, Bonsai-Net performs simultaneous model search and training, dramatically reducing the total time it takes to generate fully-trained models from scratch.
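
The abstract names two mechanisms: a differentiable pruner that makes keep/prune decisions trainable by gradient descent, and simultaneous search and training, where pruned operations are deleted from the live model as it trains. The sketch below is a minimal PyTorch illustration of that idea, not the paper's exact formulation: the sigmoid gate, temperature, sparsity coefficient, pruning threshold, and all class names (`DifferentiablePruner`, `PrunableEdge`) are assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DifferentiablePruner(nn.Module):
    """Soft gate approximating a binary keep/prune decision.

    A learnable scalar is squashed through a steep sigmoid so the gate
    stays differentiable and trains jointly with the network weights.
    (Illustrative formulation; temperature and threshold are assumptions.)
    """

    def __init__(self, init: float = 1.0, temperature: float = 5.0):
        super().__init__()
        self.weight = nn.Parameter(torch.tensor(init))
        self.temperature = temperature

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.temperature * self.weight) * x

    def is_dead(self, threshold: float = 0.05) -> bool:
        # Once the gate collapses toward zero, the gated op can be
        # deleted from the live model, freeing memory during the search.
        return torch.sigmoid(self.temperature * self.weight).item() < threshold


class PrunableEdge(nn.Module):
    """An edge holding several candidate ops, each behind its own pruner."""

    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        self.pruners = nn.ModuleList(DifferentiablePruner() for _ in ops)

    def forward(self, x):
        return sum(p(op(x)) for op, p in zip(self.ops, self.pruners))

    def prune_dead_ops(self):
        keep = [i for i, p in enumerate(self.pruners) if not p.is_dead()]
        if not keep:  # never let the edge go empty; keep the strongest op
            keep = [max(range(len(self.pruners)),
                        key=lambda i: self.pruners[i].weight.item())]
        self.ops = nn.ModuleList(self.ops[i] for i in keep)
        self.pruners = nn.ModuleList(self.pruners[i] for i in keep)


# Search and training share one loop: a single optimizer updates both the
# op weights and the pruner gates, and dead ops are removed periodically.
edge = PrunableEdge([
    nn.Conv2d(3, 3, 3, padding=1),
    nn.Identity(),
    nn.AvgPool2d(3, stride=1, padding=1),
])
opt = torch.optim.SGD(edge.parameters(), lr=0.01)

for step in range(1000):
    x, target = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
    loss = F.mse_loss(edge(x), target)
    # A sparsity penalty on the gates encourages pruning (the coefficient
    # is an assumption for illustration).
    loss = loss + 1e-3 * sum(torch.sigmoid(p.temperature * p.weight)
                             for p in edge.pruners)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 200 == 199:
        edge.prune_dead_ops()
        # Rebuild the optimizer so it only tracks surviving parameters.
        opt = torch.optim.SGD(edge.parameters(), lr=0.01)
```

Because pruned operations are physically removed rather than merely zeroed out, the one-shot model shrinks as training proceeds, which is how search and training can share a single run and yield a fully-trained model at the end, as the abstract describes.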

Authors (3)
  1. Rob Geada (4 papers)
  2. Dennis Prangle (23 papers)
  3. Andrew Stephen McGough (18 papers)
Citations (5)
