Batch Normalization Tells You Which Filter is Important (2112.01155v2)

Published 2 Dec 2021 in cs.CV and cs.AI

Abstract: The goal of filter pruning is to find and remove unimportant filters so that convolutional neural networks (CNNs) become efficient without sacrificing performance. The challenge lies in finding information that indicates how important each filter is with respect to the final output of the network. In this work, we share our observation that the batch normalization (BN) parameters of pre-trained CNNs can be used to estimate the feature distribution of activation outputs without processing any training data. Based on this observation, we propose a simple yet effective filter pruning method that evaluates the importance of each filter from the BN parameters of the pre-trained network. Experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method achieves an outstanding trade-off, with and without fine-tuning, between the accuracy drop and the reduction in computational complexity and parameter count of the pruned networks.
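
To make the idea concrete, here is a minimal sketch of a BN-based filter importance score. It assumes (as an illustration, not as the paper's exact criterion) that each channel's pre-BN activation is roughly standard normal, so the post-BN output is distributed as N(beta, gamma^2), and scores each filter by its expected post-ReLU activation, which has a closed form. The function name `bn_filter_importance` and the 50% pruning ratio are hypothetical choices for this example.

```python
# Sketch only: score each filter from BN parameters alone, assuming the
# post-BN activation for channel c is approximately N(beta_c, gamma_c^2).
import math

import torch
import torch.nn as nn


def bn_filter_importance(bn: nn.BatchNorm2d) -> torch.Tensor:
    """Per-channel importance from BN weight (gamma) and bias (beta) only."""
    gamma = bn.weight.detach().abs()   # std of the assumed post-BN distribution
    beta = bn.bias.detach()            # mean of the assumed post-BN distribution
    # E[max(0, X)] for X ~ N(beta, gamma^2):
    #   gamma * pdf(beta / gamma) + beta * cdf(beta / gamma)
    z = beta / gamma.clamp_min(1e-12)
    pdf = torch.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + torch.erf(z / math.sqrt(2.0)))
    return gamma * pdf + beta * cdf


# Usage: rank the 64 filters of one conv/BN pair and keep the top half.
bn = nn.BatchNorm2d(64)
scores = bn_filter_importance(bn)
keep = scores.argsort(descending=True)[: 64 // 2]  # indices of filters to keep
```

Because the score depends only on the stored BN parameters, it can be computed on a pre-trained network without running any training data through it, which is the property the abstract highlights.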

Authors (5)
  1. Junghun Oh (6 papers)
  2. Heewon Kim (12 papers)
  3. Sungyong Baik (17 papers)
  4. Cheeun Hong (6 papers)
  5. Kyoung Mu Lee (107 papers)
Citations (7)
