Rethinking the Number of Channels for the Convolutional Neural Network (1909.01861v1)

Published 4 Sep 2019 in cs.CV

Abstract: The latest algorithms for automatic neural architecture search perform remarkably well, but few of them can effectively design the number of channels for convolutional neural networks while consuming little computational effort. In this paper, we propose a method for efficient automatic architecture search that targets the widths of networks rather than the connections of the neural architecture. Our method, functionally incremental search based on function preservation, explores the number of channels rapidly while controlling the number of parameters of the target network. On CIFAR-10 and CIFAR-100 classification, our method, using minimal computational resources (0.4~1.3 GPU-days), discovers more efficient rules for network widths, improving accuracy by about 0.5% on CIFAR-10 and about 2.33% on CIFAR-100 with fewer parameters. In particular, our method is suitable for rapidly exploring the number of channels of almost any convolutional neural network.
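
The abstract describes an incremental, function-preserving search over channel widths. As an illustration of the general idea only (not the authors' exact procedure), the sketch below shows a Net2WiderNet-style function-preserving widening of one convolutional layer in NumPy: extra output channels are copies of existing ones, and the next layer's incoming weights are rescaled so the widened network computes the same function before training continues. The function name, shapes, and replication scheme are assumptions made for illustration.

```python
import numpy as np

def widen_conv_pair(w1, b1, w2, new_out):
    """Function-preserving widening sketch (Net2WiderNet-style).

    w1: weights of the layer being widened, shape (out, in, k, k)
    b1: its bias, shape (out,)
    w2: weights of the next conv layer, shape (out2, out, k, k)
    new_out: desired number of output channels (>= out)
    Returns widened (w1, b1, w2) that compute the same function.
    """
    out = w1.shape[0]
    assert new_out >= out
    # Map each new channel index to an existing channel; the extra
    # channels are random copies of existing ones.
    mapping = np.concatenate([np.arange(out),
                              np.random.randint(0, out, new_out - out)])
    # How many times each original channel appears after widening.
    counts = np.bincount(mapping, minlength=out)

    w1_new = w1[mapping]   # duplicate filters of the widened layer
    b1_new = b1[mapping]
    # Rescale the next layer's incoming weights so that the summed
    # contribution of duplicated channels is unchanged.
    w2_new = w2[:, mapping] / counts[mapping][None, :, None, None]
    return w1_new, b1_new, w2_new

# Example: widen a 16-channel layer to 24 channels without changing
# what the two-layer stack computes.
w1 = np.random.randn(16, 3, 3, 3); b1 = np.random.randn(16)
w2 = np.random.randn(32, 16, 3, 3)
w1n, b1n, w2n = widen_conv_pair(w1, b1, w2, new_out=24)
```

One way such a transformation could fit into a width search, consistent with the abstract, is to grow channels in small increments, continue training briefly after each function-preserving step, and keep only the widths that improve validation accuracy within a parameter budget.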

Authors (6)
  1. Hui Zhu (49 papers)
  2. Zhulin An (43 papers)
  3. Chuanguang Yang (36 papers)
  4. Xiaolong Hu (14 papers)
  5. Kaiqiang Xu (6 papers)
  6. Yongjun Xu (81 papers)
Citations (3)
