
Accel-NASBench: Sustainable Benchmarking for Accelerator-Aware NAS (2404.08005v2)

Published 9 Apr 2024 in cs.LG and eess.IV

Abstract: One of the primary challenges impeding the progress of Neural Architecture Search (NAS) is its extensive reliance on exorbitant computational resources. NAS benchmarks aim to simulate runs of NAS experiments at zero cost, obviating the need for extensive compute. However, existing NAS benchmarks use synthetic datasets and model proxies that make simplified assumptions about the characteristics of these datasets and models, leading to unrealistic evaluations. We present a technique that allows searching for training proxies that reduce the cost of benchmark construction by significant margins, making it possible to construct realistic NAS benchmarks for large-scale datasets. Using this technique, we construct an open-source bi-objective NAS benchmark for the ImageNet2012 dataset combined with the on-device performance of accelerators, including GPUs, TPUs, and FPGAs. Through extensive experimentation with various NAS optimizers and hardware platforms, we show that the benchmark is accurate and allows searching for state-of-the-art hardware-aware models at zero cost.
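
To illustrate the zero-cost, bi-objective usage pattern the abstract describes, the sketch below shows how a surrogate NAS benchmark might be queried during a random search that trades off predicted ImageNet accuracy against predicted accelerator throughput. This is a minimal illustration only: the `SurrogateBenchmark` class, its method names, the architecture encoding, and the random predictions are assumptions for demonstration and are not the actual Accel-NASBench API or data.

```python
import random

class SurrogateBenchmark:
    """Hypothetical stand-in for a surrogate NAS benchmark: returns predicted
    ImageNet accuracy and device throughput for an architecture encoding at
    zero training cost. Names and values are illustrative only."""

    def __init__(self, device="tpu-v3"):
        self.device = device

    def predict_accuracy(self, arch):
        # A real surrogate benchmark would query a trained regressor here.
        random.seed(hash((tuple(arch), "acc")))
        return 70.0 + 10.0 * random.random()

    def predict_throughput(self, arch):
        # Predicted images/sec on the chosen accelerator (GPU / TPU / FPGA).
        random.seed(hash((tuple(arch), self.device)))
        return 500.0 + 1500.0 * random.random()

def sample_arch(num_stages=7):
    """Sample a toy MnasNet-style encoding: (kernel size, expansion ratio,
    layer count) per stage."""
    return [(random.choice([3, 5]), random.choice([3, 6]),
             random.choice([1, 2, 3, 4])) for _ in range(num_stages)]

def pareto_front(points):
    """Keep candidates not dominated in (accuracy, throughput), both maximized."""
    return [p for p in points
            if not any(q is not p and q["acc"] >= p["acc"] and q["tput"] >= p["tput"]
                       for q in points)]

if __name__ == "__main__":
    bench = SurrogateBenchmark(device="tpu-v3")
    candidates = []
    for _ in range(200):  # every "evaluation" is a cheap surrogate query
        arch = sample_arch()
        candidates.append({"arch": arch,
                           "acc": bench.predict_accuracy(arch),
                           "tput": bench.predict_throughput(arch)})
    for p in sorted(pareto_front(candidates), key=lambda x: -x["acc"])[:5]:
        print(f"acc={p['acc']:.2f}%  throughput={p['tput']:.0f} img/s")
```

In an actual run, the two `predict_*` calls would be replaced by the benchmark's trained accuracy and on-device performance surrogates, and the random sampler by any NAS optimizer (evolutionary, Bayesian, reinforcement-learning based); the overall loop structure stays the same.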

Authors (4)
  1. Afzal Ahmad (6 papers)
  2. Linfeng Du (7 papers)
  3. Zhiyao Xie (30 papers)
  4. Wei Zhang (1489 papers)
