
DCNAS: Densely Connected Neural Architecture Search for Semantic Image Segmentation (2003.11883v2)

Published 26 Mar 2020 in cs.CV, cs.LG, and eess.IV

Abstract: Neural Architecture Search (NAS) has shown great potential in automatically designing scalable network architectures for dense image prediction. However, existing NAS algorithms usually compromise on a restricted search space and search on a proxy task to meet achievable computational demands. To cover as wide a range of network architectures as possible and avoid the gap between the target and proxy datasets, we propose a Densely Connected NAS (DCNAS) framework, which directly searches for the optimal network structures for multi-scale representations of visual information over a large-scale target dataset. Specifically, by connecting cells with each other using learnable weights, we introduce a densely connected search space that covers an abundance of mainstream network designs. Moreover, by combining both path-level and channel-level sampling strategies, we design a fusion module that reduces the memory consumption of the ample search space. We demonstrate that the architecture obtained by our DCNAS algorithm achieves state-of-the-art performance on public semantic image segmentation benchmarks, including 84.3% on Cityscapes and 86.9% on PASCAL VOC 2012. We also retain leading performance when evaluating the architecture on the more challenging ADE20K and Pascal Context datasets.
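The abstract's core mechanism, fusing the outputs of multiple predecessor cells through learnable connection weights while sampling at the path and channel levels to bound memory, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function name `fuse_predecessors`, the top-k path selection, and the random channel subset are illustrative assumptions standing in for the paper's actual sampling strategies.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_predecessors(features, logits, n_paths=2, channel_frac=0.5, rng=rng):
    """Fuse predecessor-cell outputs with learnable connection weights.

    features: list of (C, H, W) arrays, one per predecessor cell.
    logits:   learnable scalars, one per incoming connection (hypothetical).
    Path-level sampling keeps only the `n_paths` strongest connections;
    channel-level sampling updates only a fraction of channels, leaving
    the rest untouched to cut memory traffic during search.
    """
    # softmax over connection logits -> connection probabilities
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # path-level sampling: keep the n_paths most probable connections
    keep = np.argsort(probs)[-n_paths:]
    fused = sum(probs[i] * features[i] for i in keep) / probs[keep].sum()
    # channel-level sampling: mix only a random subset of channels
    C = fused.shape[0]
    idx = rng.choice(C, size=int(C * channel_frac), replace=False)
    out = features[keep[-1]].copy()   # pass-through from the strongest path
    out[idx] = fused[idx]
    return out

feats = [rng.standard_normal((8, 4, 4)) for _ in range(3)]
logits = np.array([0.1, 0.5, -0.2])   # one weight per candidate connection
out = fuse_predecessors(feats, logits)
print(out.shape)  # (8, 4, 4)
```

In the actual search, the connection logits would be optimized jointly with the network weights, and the discovered connectivity pattern is read off from the converged weights.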

Authors (7)
  1. Xiong Zhang (28 papers)
  2. Hongmin Xu (10 papers)
  3. Hong Mo (3 papers)
  4. Jianchao Tan (24 papers)
  5. Cheng Yang (168 papers)
  6. Lei Wang (975 papers)
  7. Wenqi Ren (67 papers)
Citations (84)
