
Convolutional Neural Network Pruning Using Filter Attenuation (2002.03299v1)

Published 9 Feb 2020 in cs.CV

Abstract: Filters are the essential elements of convolutional neural networks (CNNs). Each filter corresponds to a feature map, and filters account for the main part of the computational and memory requirements of CNN processing. In filter pruning methods, a filter is removed together with all of its components, including its channels and connections. Removing a filter can cause a drastic change in the network's performance, and removed filters cannot return to the network structure. We address these problems in this paper. We propose a CNN pruning method based on filter attenuation in which weak filters are not directly removed; instead, they are attenuated and gradually removed. In the proposed attenuation approach, weak filters are not abruptly removed, so there is a chance for these filters to return to the network. The filter attenuation method is assessed using the VGG model on the CIFAR-10 image classification task. Simulation results show that filter attenuation works with different pruning criteria and yields better results than conventional pruning methods.
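The core idea of the abstract, soft-pruning by attenuation rather than hard removal, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the L1-norm ranking criterion, and the `decay` factor are assumptions chosen for clarity (the paper states the approach works with different pruning criteria).

```python
import numpy as np

def attenuate_filters(weights, keep_ratio=0.5, decay=0.5):
    """Soft-prune one conv layer: scale down weak filters instead of deleting them.

    weights: array of shape (num_filters, ...), one row per filter.
    Filters are ranked by L1 norm (one common pruning criterion); the
    weakest (1 - keep_ratio) fraction is multiplied by `decay` rather
    than zeroed, so later training updates can still revive them.
    """
    num_filters = weights.shape[0]
    # L1 norm of each filter, flattening channels and spatial dims.
    norms = np.abs(weights).reshape(num_filters, -1).sum(axis=1)
    num_keep = int(np.ceil(keep_ratio * num_filters))
    # Indices of the weakest filters (smallest norms first).
    weak = np.argsort(norms)[: num_filters - num_keep]
    out = weights.copy()
    out[weak] *= decay  # attenuate instead of hard-removing
    return out, weak
```

Calling this once per pruning step shrinks weak filters geometrically toward zero; a filter that grows strong again under training simply stops being selected, which is the "return to the network" property the abstract describes.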

Authors (5)
  1. Morteza Mousa-Pasandi (1 paper)
  2. Mohsen Hajabdollahi (17 papers)
  3. Nader Karimi (79 papers)
  4. Shadrokh Samavi (91 papers)
  5. Shahram Shirani (37 papers)
Citations (3)
