Improving Ranking Correlation of Supernet with Candidates Enhancement and Progressive Training (2108.05866v1)
Abstract: One-shot neural architecture search (NAS) uses a weight-sharing supernet to reduce the otherwise unaffordable computational cost of automated architecture design. However, weight sharing degrades the ranking consistency of performance estimates because of interference between different candidate networks. To address this issue, we propose a candidates enhancement method and a progressive training pipeline to improve the ranking correlation of the supernet. Specifically, we carefully redesign the sub-networks in the supernet and map the original supernet to a new one of higher capacity. In addition, we gradually add narrow branches to the supernet to reduce the degree of weight sharing, which effectively alleviates mutual interference between sub-networks. Finally, our method took 1st place in the Supernet Track of the CVPR 2021 1st Lightweight NAS Challenge.
- Ziwei Yang
- Ruyi Zhang
- Zhi Yang
- Xubo Yang
- Lei Wang
- Zheyang Li
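The ranking correlation the abstract refers to is commonly measured with Kendall's tau between the accuracies a supernet predicts for candidate architectures and the accuracies those architectures achieve when trained stand-alone. Below is a minimal sketch of that evaluation; the accuracy values are hypothetical and purely illustrative, not results from the paper.

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation between two score lists (assumes no ties)."""
    assert len(x) == len(y) and len(x) >= 2
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        # A pair is concordant if both score lists order it the same way.
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / n_pairs

# Hypothetical data: accuracies of five candidate architectures as
# estimated by a shared-weight supernet vs. trained stand-alone.
supernet_acc = [0.62, 0.58, 0.71, 0.66, 0.60]
standalone_acc = [0.74, 0.70, 0.80, 0.79, 0.69]

# A tau near 1.0 means the supernet ranks candidates reliably.
print(round(kendall_tau(supernet_acc, standalone_acc), 3))  # → 0.8
```

Techniques like the paper's candidate enhancement and progressive branch addition aim to push this correlation toward 1.0, so that architectures selected via the supernet are genuinely the strongest ones.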