Convolutional Neural Network Simplification with Progressive Retraining (2101.04699v1)

Published 12 Jan 2021 in cs.LG and cs.CV

Abstract: Kernel pruning methods have been proposed to speed up, simplify, and improve the explanation of convolutional neural network (CNN) models. However, the effectiveness of a simplified model often falls below that of the original. In this letter, we present new methods based on objective and subjective relevance criteria for kernel elimination in a layer-by-layer fashion. During the process, the CNN model is retrained only when the current layer has been entirely simplified, by adjusting the weights from the next layer back to the first while preserving the weights of subsequent layers not involved in the process. We call this strategy "progressive retraining", in contrast to kernel pruning methods that usually retrain the entire model after each simplification action, e.g., the elimination of one or a few kernels. Our subjective relevance criterion exploits the human ability to recognize visual patterns and improves the designer's understanding of the simplification process. The combination of suitable relevance criteria and progressive retraining shows that our methods can increase effectiveness while considerably simplifying the model. We also demonstrate that our methods outperform two popular methods and one state-of-the-art method on four challenging image datasets.
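
To make the strategy concrete, here is a minimal PyTorch sketch of layer-by-layer kernel pruning with progressive retraining, as the abstract describes it. The relevance criterion (L1 norm of each kernel), the mask-based pruning, the keep ratio, and the helper names (kernel_relevance, prune_layer, progressive_retrain, train_fn) are illustrative assumptions, not the paper's exact method.

```python
# Minimal sketch, assuming PyTorch. The relevance criterion and training
# loop are placeholders; the paper's objective/subjective criteria and
# exact retraining schedule are not reproduced here.
import torch
import torch.nn as nn

def kernel_relevance(conv: nn.Conv2d) -> torch.Tensor:
    # Hypothetical objective criterion: L1 norm of each output kernel.
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def prune_layer(conv: nn.Conv2d, keep_ratio: float) -> None:
    # Zero out the least-relevant kernels; masking stands in for the
    # structural removal a real implementation would perform.
    scores = kernel_relevance(conv)
    k = max(1, int(keep_ratio * scores.numel()))
    mask = torch.zeros_like(scores, dtype=torch.bool)
    mask[scores.topk(k).indices] = True
    with torch.no_grad():
        conv.weight[~mask] = 0.0
        if conv.bias is not None:
            conv.bias[~mask] = 0.0

def progressive_retrain(model, convs, layer_idx, train_fn):
    # After fully simplifying layer `layer_idx`, adjust only the weights
    # from the next layer back to the first, freezing subsequent layers --
    # the "progressive retraining" step described in the abstract.
    for p in model.parameters():
        p.requires_grad_(False)
    for conv in convs[: layer_idx + 2]:
        for p in conv.parameters():
            p.requires_grad_(True)
    train_fn(model)  # caller-supplied fine-tuning loop (assumption)

def simplify(model, train_fn, keep_ratio=0.5):
    # Layer-by-layer simplification: each conv layer is pruned entirely
    # before a single progressive-retraining pass is run for it.
    convs = [m for m in model.modules() if isinstance(m, nn.Conv2d)]
    for i, conv in enumerate(convs):
        prune_layer(conv, keep_ratio)
        progressive_retrain(model, convs, i, train_fn)
```

For brevity the sketch only toggles convolutional layers; batch-norm and classifier parameters would need the same freeze/unfreeze treatment in practice.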
