
DCT-Conv: Coding filters in convolutional networks with Discrete Cosine Transform

Published 23 Jan 2020 in cs.NE and cs.CV (arXiv:2001.08517v4)

Abstract: Convolutional neural networks rely on a huge number of trained weights. Consequently, they are often data-greedy, sensitive to overtraining, and learn slowly. We follow the line of research in which the filters of convolutional layers are determined by a smaller number of trained parameters. In this paper, the trained parameters define a frequency spectrum that is transformed into convolutional filters with the Inverse Discrete Cosine Transform (IDCT, the same transform used in JPEG decompression). We analyze how switching off selected components of the spectra, thereby reducing the number of trained weights in the network, affects its performance. Our experiments show that coding the filters with trained DCT parameters leads to an improvement over traditional convolution. Moreover, the performance of networks modified in this way degrades very slowly as an increasing fraction of these parameters is switched off. In some experiments, good performance is observed even when 99.9% of these parameters are switched off.
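To make the mechanism concrete, here is a minimal PyTorch sketch of the idea described in the abstract (not the authors' code): each filter is stored as a trainable DCT spectrum, a fixed binary mask "switches off" a fraction of the spectral components, and the 2D inverse DCT of the masked spectrum yields the convolutional filter. The layer name DCTConv2d, the keep_ratio parameter, and the random choice of masked components are illustrative assumptions; the paper's component-selection scheme may differ.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

def dct_matrix(k: int) -> torch.Tensor:
    """Orthonormal DCT-II basis matrix C with C[u, x] = a(u) * cos(pi*(2x+1)*u / (2k))."""
    x = torch.arange(k, dtype=torch.float32)
    u = x.view(-1, 1)
    C = torch.cos(math.pi * (2 * x + 1) * u / (2 * k)) * math.sqrt(2.0 / k)
    C[0] /= math.sqrt(2.0)  # a(0) scaling so that C @ C.T = I
    return C

class DCTConv2d(nn.Module):
    """Convolution whose k x k filters are the 2D inverse DCT of trained spectra."""
    def __init__(self, in_ch: int, out_ch: int, k: int = 3, keep_ratio: float = 1.0):
        super().__init__()
        # Trained parameters live in the frequency domain, one spectrum per filter.
        self.spectrum = nn.Parameter(0.1 * torch.randn(out_ch, in_ch, k, k))
        # Fixed binary mask "switching off" a fraction of spectral components
        # (random here for illustration; the paper's selection scheme may differ).
        mask = (torch.rand(out_ch, in_ch, k, k) < keep_ratio).float()
        self.register_buffer("mask", mask)
        self.register_buffer("C", dct_matrix(k))
        self.pad = k // 2

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # 2D inverse DCT of the masked spectrum: filters = C^T @ S @ C.
        filters = self.C.t() @ (self.spectrum * self.mask) @ self.C
        return F.conv2d(x, filters, padding=self.pad)

# Usage: a layer with 75% of the spectral components switched off.
layer = DCTConv2d(in_ch=3, out_ch=16, k=3, keep_ratio=0.25)
y = layer(torch.randn(1, 3, 32, 32))  # -> shape (1, 16, 32, 32)
```

Because the mask is fixed, the masked spectral components contribute no gradients and are effectively removed from the trained parameter count, which is how the abstract's extreme settings (up to 99.9% of parameters switched off) would be realized in this sketch.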

Citations (11)
