
CircConv: A Structured Convolution with Low Complexity (1902.11268v1)

Published 28 Feb 2019 in cs.CV

Abstract: Deep neural networks (DNNs), especially deep convolutional neural networks (CNNs), have emerged as a powerful technique in a variety of machine learning applications. However, the large model sizes of DNNs place high demands on computational resources and weight storage, limiting their practical deployment. To overcome these limitations, this paper proposes imposing a circulant structure on the construction of convolutional layers, leading to circulant convolutional layers (CircConvs) and circulant CNNs. The circulant structure and models can either be trained from scratch or re-trained from a pre-trained non-circulant model, making the approach flexible across different training environments. Extensive experiments show that this strong structure-imposing approach substantially reduces the number of parameters of convolutional layers and enables significant savings in computational cost via fast multiplication of the circulant tensor.
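
The savings claimed in the abstract follow from the algebra of circulant matrices: an n x n circulant block is fully determined by a single length-n vector, and multiplying it by a vector reduces to a circular convolution, which the FFT computes in O(n log n) instead of O(n^2). The sketch below illustrates this core idea in NumPy; it is an illustrative reconstruction under that assumption, not the authors' implementation, and the block size and function names are invented for the example.

```python
import numpy as np

def circulant_matvec(first_row, x):
    """Multiply the circulant matrix whose rows are cyclic shifts of
    `first_row` by vector `x`, using the FFT identity
    C @ x = IFFT(FFT(c) * FFT(x)), where c is the first column of C."""
    # Recover the first column from the first row: c[k] = row[(-k) mod n].
    first_col = np.concatenate(([first_row[0]], first_row[:0:-1]))
    return np.real(np.fft.ifft(np.fft.fft(first_col) * np.fft.fft(x)))

# Check against an explicit dense circulant matrix (illustrative n = 8).
n = 8
rng = np.random.default_rng(0)
row = rng.standard_normal(n)
x = rng.standard_normal(n)

C = np.empty((n, n))
for i in range(n):
    C[i] = np.roll(row, i)  # row i is the first row cyclically shifted by i

assert np.allclose(C @ x, circulant_matvec(row, x))
# Storage per block drops from n*n weights to n, and the multiply cost
# from O(n^2) to O(n log n), matching the abstract's complexity claim.
```

A CircConv layer applies the same trick block-wise: the flattened weight matrix is partitioned into n x n circulant blocks, each storing one length-n vector, with the per-block matrix-vector products batched through the FFT.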

Authors (6)
  1. Siyu Liao (14 papers)
  2. Zhe Li (211 papers)
  3. Liang Zhao (353 papers)
  4. Qinru Qiu (36 papers)
  5. Yanzhi Wang (197 papers)
  6. Bo Yuan (151 papers)
Citations (14)
