
A Semi-Supervised Assessor of Neural Architectures (2005.06821v1)

Published 14 May 2020 in cs.CV and cs.LG

Abstract: Neural architecture search (NAS) aims to automatically design deep neural networks with satisfactory performance, and an architecture performance predictor is critical for efficiently evaluating intermediate neural architectures. Training such a predictor, however, typically requires collecting many neural architectures together with their real performance. In contrast to classical performance predictors optimized in a fully supervised way, this paper proposes a semi-supervised assessor of neural architectures. We employ an auto-encoder to discover meaningful representations of neural architectures. Taking each neural architecture as an individual instance in the search space, we construct a graph that captures their intrinsic similarities, involving both labeled and unlabeled architectures. A graph convolutional neural network is introduced to predict the performance of architectures based on the learned representations and the relations modeled by the graph. Extensive experimental results on the NAS-Benchmark-101 dataset demonstrate that our method significantly reduces the number of fully trained architectures required to find efficient architectures.
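The pipeline the abstract describes (auto-encoder embeddings of architectures, a similarity graph over labeled and unlabeled instances, and a graph convolutional network predicting performance) can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the layer sizes, the toy graph, and all function names are assumptions for demonstration.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetric GCN normalization: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_predict(A, X, W1, W2):
    """Two-layer GCN: hidden = ReLU(A_hat X W1), output = A_hat hidden W2.

    Each node is one architecture; the output is a scalar predicted
    performance per architecture. Predictions for unlabeled nodes are
    influenced by labeled neighbors through the similarity graph.
    """
    A_hat = normalized_adjacency(A)
    H = np.maximum(A_hat @ X @ W1, 0.0)  # hidden layer with ReLU
    return A_hat @ H @ W2                # one prediction per node

# Toy example: 4 architectures with 8-dim latent codes
# (standing in for the auto-encoder representations).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
# Similarity graph mixing "labeled" and "unlabeled" architectures.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
W1 = rng.normal(size=(8, 16)) * 0.1
W2 = rng.normal(size=(16, 1)) * 0.1

pred = gcn_predict(A, X, W1, W2)
print(pred.shape)  # one predicted performance value per architecture
```

In the paper's semi-supervised setting, the GCN weights would be trained so that predictions at labeled nodes match measured accuracies, while unlabeled architectures contribute through the graph structure; the sketch above only shows the forward propagation step.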

Authors (9)
  1. Yehui Tang (63 papers)
  2. Yunhe Wang (145 papers)
  3. Yixing Xu (25 papers)
  4. Hanting Chen (52 papers)
  5. Chunjing Xu (66 papers)
  6. Boxin Shi (64 papers)
  7. Chao Xu (283 papers)
  8. Qi Tian (314 papers)
  9. Chang Xu (323 papers)
Citations (63)