Dynamic Channel Pruning: Feature Boosting and Suppression (1810.05331v2)

Published 12 Oct 2018 in cs.CV

Abstract: Making deep convolutional neural networks more accurate typically comes at the cost of increased computational and memory resources. In this paper, we reduce this cost by exploiting the fact that the importance of features computed by convolutional layers is highly input-dependent, and propose feature boosting and suppression (FBS), a new method to predictively amplify salient convolutional channels and skip unimportant ones at run-time. FBS introduces small auxiliary connections to existing convolutional layers. In contrast to channel pruning methods which permanently remove channels, it preserves the full network structures and accelerates convolution by dynamically skipping unimportant input and output channels. FBS-augmented networks are trained with conventional stochastic gradient descent, making it readily available for many state-of-the-art CNNs. We compare FBS to a range of existing channel pruning and dynamic execution schemes and demonstrate large improvements on ImageNet classification. Experiments show that FBS can respectively provide $5\times$ and $2\times$ savings in compute on VGG-16 and ResNet-18, both with less than $0.6\%$ top-5 accuracy loss.

Citations (293)

Summary

  • The paper introduces FBS, a dynamic method that selectively boosts important convolutional channels while suppressing less critical ones.
  • It integrates auxiliary connections in CNNs to dynamically adjust channel saliency without removing channels permanently, preserving network integrity.
  • Empirical results show up to 5× computational savings with less than 0.6% drop in top-5 accuracy, underscoring its practical benefits for resource-constrained applications.

Dynamic Channel Pruning: Feature Boosting and Suppression

The paper focuses on a novel approach to improving the efficiency of convolutional neural networks (CNNs) in the face of increasing computational demands. The authors introduce Feature Boosting and Suppression (FBS), a dynamic channel pruning method that predictively amplifies important convolutional channels and strategically skips unimportant ones during runtime. This stands in contrast to traditional channel pruning techniques that irreversibly remove channels based on their average utility across datasets.

Summary of Methodology

The FBS method operates on the premise that the importance of different convolutional channels varies considerably with the input. By pruning dynamically rather than statically, the proposed method retains the full network structure and lets a CNN adjust its execution pattern in response to the characteristics of each input.

  1. Dynamic Pruning Strategy:
    • FBS introduces auxiliary connections to existing convolutional layers that evaluate the importance of each channel at run-time. These connections use the previous layer's output to predict channel saliencies for the current layer (a minimal code sketch follows this list).
  2. Implementation:
    • The original CNN structure remains intact, so FBS integrates easily with existing architectures, and FBS-augmented networks can be trained with conventional stochastic gradient descent (SGD).
  3. Efficiency:
    • The method achieves substantial computational savings, demonstrating 5× and 2× reductions in computation on VGG-16 and ResNet-18 respectively, with a negligible top-5 accuracy loss of less than 0.6%.
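
To make the mechanism in item 1 concrete, below is a minimal PyTorch-style sketch of an FBS-augmented convolution. It is an illustrative reconstruction, not the authors' code: the class name FBSConv2d, the single-linear-layer predictor, and the keep_ratio value are assumptions; only the overall pattern (pool the previous layer's output, predict per-channel saliencies, keep and boost the top-k channels, suppress the rest) follows the paper.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FBSConv2d(nn.Module):
        """FBS-style convolution: a small auxiliary predictor gates output channels per input."""

        def __init__(self, in_channels, out_channels, kernel_size, keep_ratio=0.5, **kwargs):
            super().__init__()
            self.conv = nn.Conv2d(in_channels, out_channels, kernel_size, **kwargs)
            # Auxiliary connection: predicts one saliency score per output channel
            # from a globally pooled summary of the previous layer's output.
            self.saliency = nn.Linear(in_channels, out_channels)
            self.keep_ratio = keep_ratio  # fraction of channels evaluated (assumed value)

        def forward(self, x):
            pooled = F.adaptive_avg_pool2d(x, 1).flatten(1)   # (N, C_in) channel summary
            scores = F.relu(self.saliency(pooled))            # (N, C_out) non-negative saliencies
            k = max(1, int(self.keep_ratio * scores.size(1)))
            top = scores.topk(k, dim=1)                       # winner-take-all: keep top-k channels
            gate = torch.zeros_like(scores).scatter(1, top.indices, top.values)
            # Suppressed channels have gate 0 and could be skipped entirely at
            # inference time; here the full convolution is computed and masked
            # for simplicity, and surviving channels are boosted by their scores.
            return self.conv(x) * gate.unsqueeze(-1).unsqueeze(-1)

The full method in the paper also integrates this gating with the layer's normalization and trains everything end-to-end with conventional SGD; the sketch above omits those details.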

Experimental Results

Empirical evaluations reveal the efficacy of the FBS approach in both computational savings and accuracy retention across varied networks and datasets:

  • ImageNet Dataset: On VGG-16 and ResNet-18, FBS matched or exceeded the accuracy of state-of-the-art channel pruning and dynamic execution methods at substantially lower computational cost, indicating its suitability for resource-constrained environments.
  • CIFAR-10 Experimentation: The experiments demonstrated the dynamic behavior of FBS, with heatmaps visualizing how the set of channels selected for evaluation shifts with the input.
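
As a rough back-of-the-envelope reading of these savings (an illustrative estimate, not a figure reported in the paper): the multiply-accumulate cost of a convolutional layer scales with the product of its active input and output channel counts, so if a fraction $d$ of the channels is evaluated on both sides of a layer, that layer's compute falls to roughly $d^2$ of the dense cost. A $5\times$ saving then corresponds to $d \approx 1/\sqrt{5} \approx 0.45$, i.e., a little under half the channels evaluated on average.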

Theoretical and Practical Implications

Theoretically, the dynamic aspect of FBS challenges the static assumptions intrinsic to many existing pruning techniques. The ability to retain full network capabilities suggests new potential for adaptive neural network architectures and greater flexibility in real-time applications.

Practically, this approach benefits edge computing applications where resource constraints are critical and dynamic adaptability is desirable. The demonstrated reduction in memory access and peak memory usage further benefits deployment in varied environments, from mobile devices to cloud services.

Speculation on Future Directions

The introduction of FBS marks a shift towards more intelligent, input-adaptive channel pruning strategies and suggests several directions for further exploration:

  • Neural Architecture Search (NAS): Integrating FBS into NAS frameworks could enhance automatic model generation tailored to dynamic input characteristics.
  • Further Granularity in Pruning: Future research could examine more fine-grained prediction mechanisms or hybrid methods combining FBS with other sparsity-inducing techniques to optimize performance across more diverse network architectures.

Conclusion

In conclusion, the FBS approach leverages the input-dependent nature of feature importance to substantially increase CNN efficiency without significant degradation in performance. The paper's contribution provides a promising direction for continued development of efficient, adaptable neural networks relevant to both academic research and industrial deployment.
