Carbon-Efficient Neural Architecture Search (2307.04131v1)
Published 9 Jul 2023 in cs.LG and cs.AI
Abstract: This work presents a novel approach to neural architecture search (NAS) that aims to reduce energy costs and increase carbon efficiency during the model design process. The proposed framework, called carbon-efficient NAS (CE-NAS), consists of NAS evaluation algorithms with different energy requirements, a multi-objective optimizer, and a heuristic GPU allocation strategy. CE-NAS dynamically balances energy-efficient sampling and energy-consuming evaluation tasks based on current carbon emissions. Using a recent NAS benchmark dataset and two carbon traces, our trace-driven simulations demonstrate that CE-NAS achieves better carbon and search efficiency than three baselines.
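The core idea of the abstract, shifting GPU capacity between low-energy architecture sampling and energy-hungry training/evaluation as grid carbon intensity changes, can be sketched as a simple threshold heuristic. This is an illustrative sketch, not the paper's actual allocation algorithm; the function name, thresholds, and linear interpolation are all assumptions for illustration.

```python
def allocate_gpus(total_gpus: int, carbon_intensity: float,
                  low_threshold: float, high_threshold: float):
    """Split GPUs between energy-consuming evaluation and energy-efficient
    sampling based on current grid carbon intensity (gCO2/kWh).

    Returns (eval_gpus, sampling_gpus).
    """
    if carbon_intensity >= high_threshold:
        eval_share = 0.0   # grid is dirty: run only cheap sampling
    elif carbon_intensity <= low_threshold:
        eval_share = 1.0   # grid is clean: spend energy on full evaluation
    else:
        # linearly taper evaluation work as carbon intensity rises
        eval_share = ((high_threshold - carbon_intensity)
                      / (high_threshold - low_threshold))
    eval_gpus = round(total_gpus * eval_share)
    return eval_gpus, total_gpus - eval_gpus
```

For example, with thresholds of 100 and 400 gCO2/kWh, a reading of 250 gCO2/kWh would split an 8-GPU cluster evenly between the two task types, while a reading above 400 would devote all GPUs to sampling.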
Authors:
- Yiyang Zhao
- Tian Guo