Stable Low-rank Tensor Decomposition for Compression of Convolutional Neural Network (2008.05441v1)

Published 12 Aug 2020 in cs.CV

Abstract: Most state-of-the-art deep neural networks are overparameterized and exhibit a high computational cost. A straightforward approach to this problem is to replace convolutional kernels with their low-rank tensor approximations, for which the Canonical Polyadic (CP) tensor decomposition is one of the best-suited models. However, fitting convolutional tensors with numerical optimization algorithms often encounters diverging components, i.e., extremely large rank-one tensors that cancel each other out. Such degeneracy often yields non-interpretable results and causes numerical instability during neural network fine-tuning. This paper is the first study of degeneracy in the tensor decomposition of convolutional kernels. We present a novel method that stabilizes the low-rank approximation of convolutional kernels and ensures efficient compression while preserving the high performance of the neural networks. We evaluate our approach on popular CNN architectures for image classification and show that our method results in much lower accuracy degradation and provides consistent performance.
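
For illustration, the CP model the abstract refers to can be sketched with TensorLy's off-the-shelf `parafac` routine. This is a minimal sketch of plain ALS-based CP decomposition applied to a convolutional kernel, not the stabilized method the paper proposes; the kernel shape, rank, and the degeneracy diagnostic below are assumptions chosen for demonstration.

```python
# Minimal sketch: CP decomposition of a 4D convolutional kernel with TensorLy.
# This uses standard ALS (tensorly.decomposition.parafac), which is exactly the
# setting in which the diverging-component degeneracy described above can occur.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
# Hypothetical conv kernel: (out_channels, in_channels, kH, kW).
kernel = rng.standard_normal((64, 32, 3, 3))

rank = 16  # assumed compression rank
weights, factors = parafac(tl.tensor(kernel), rank=rank, n_iter_max=500, tol=1e-8)

# factors: [A (64 x R), B (32 x R), C (3 x R), D (3 x R)], one per tensor mode.
approx = tl.cp_to_tensor((weights, factors))
rel_err = np.linalg.norm(kernel - approx) / np.linalg.norm(kernel)
print(f"relative reconstruction error: {rel_err:.4f}")

# Degeneracy check: diverging rank-one components show up as columns whose
# norms grow very large while the terms cancel each other in the sum.
component_norms = np.prod(
    [np.linalg.norm(f, axis=0) for f in factors], axis=0
) * np.abs(weights)
print("largest rank-one component norm:", component_norms.max())
```

In practice, the resulting factors can be mapped back into a sequence of cheaper convolutions (a 1x1 conv from in_channels to R, separable spatial convolutions within R channels, and a 1x1 conv from R to out_channels); the paper's contribution is keeping this factorization numerically stable so that fine-tuning the compressed network does not degrade.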

Authors (9)
  1. Anh-Huy Phan (18 papers)
  2. Konstantin Sobolev (5 papers)
  3. Konstantin Sozykin (5 papers)
  4. Dmitry Ermilov (2 papers)
  5. Julia Gusak (13 papers)
  6. Petr Tichavsky (15 papers)
  7. Valeriy Glukhov (1 paper)
  8. Ivan Oseledets (187 papers)
  9. Andrzej Cichocki (73 papers)
Citations (113)
