Convolutional Neural Network Compression through Generalized Kronecker Product Decomposition (2109.14710v2)

Published 29 Sep 2021 in cs.CV and cs.LG

Abstract: Modern Convolutional Neural Network (CNN) architectures, despite their superiority in solving various problems, are generally too large to be deployed on resource-constrained edge devices. In this paper, we reduce the memory usage and floating-point operations required by convolutional layers in CNNs. We compress these layers by generalizing the Kronecker Product Decomposition to apply to multidimensional tensors, leading to the Generalized Kronecker Product Decomposition (GKPD). Our approach yields a plug-and-play module that can be used as a drop-in replacement for any convolutional layer. Experimental results for image classification on the CIFAR-10 and ImageNet datasets using ResNet, MobileNetV2 and SENet architectures substantiate the effectiveness of our proposed approach. We find that GKPD outperforms state-of-the-art decomposition methods, including Tensor-Train and Tensor-Ring, as well as other relevant compression methods such as pruning and knowledge distillation.
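
To give a flavor of the underlying idea, the classical (2D) building block that GKPD generalizes is the nearest Kronecker product problem: approximating a weight matrix W as kron(A, B), which can be solved exactly via a rearrangement of W followed by a rank-1 SVD (the Van Loan–Pitsianis method). The sketch below illustrates this 2D case only; the factor shapes and function name are illustrative, and the paper's actual contribution extends this to multidimensional convolution tensors with sums of Kronecker terms.

```python
import numpy as np

def nearest_kronecker(W, a_shape, b_shape):
    """Approximate W (shape (m1*m2, n1*n2)) as kron(A, B) in the
    Frobenius norm, via the Van Loan-Pitsianis rearrangement trick."""
    m1, n1 = a_shape
    m2, n2 = b_shape
    # Rearrange W so that each (m2 x n2) block becomes one row; a
    # Kronecker-structured W then corresponds to a rank-1 matrix R.
    R = (W.reshape(m1, m2, n1, n2)
          .transpose(0, 2, 1, 3)
          .reshape(m1 * n1, m2 * n2))
    # Best rank-1 approximation of R gives the optimal factors.
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    A = np.sqrt(s[0]) * U[:, 0].reshape(m1, n1)
    B = np.sqrt(s[0]) * Vt[0].reshape(m2, n2)
    return A, B

# Sanity check: an exact Kronecker product is recovered exactly.
rng = np.random.default_rng(0)
A0, B0 = rng.standard_normal((3, 2)), rng.standard_normal((4, 5))
W = np.kron(A0, B0)
A, B = nearest_kronecker(W, (3, 2), (4, 5))
print(np.allclose(np.kron(A, B), W))  # True
```

The compression benefit is that storing A and B takes m1·n1 + m2·n2 parameters instead of m1·m2·n1·n2 for W; GKPD applies the analogous idea to 4D convolution kernels.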

Authors (4)
  1. Marawan Gamal Abdel Hameed (4 papers)
  2. Marzieh S. Tahaei (6 papers)
  3. Ali Mosleh (10 papers)
  4. Vahid Partovi Nia (40 papers)
Citations (19)
