Information Bottleneck Theory on Convolutional Neural Networks (1911.03722v2)

Published 9 Nov 2019 in stat.ML and cs.LG

Abstract: In recent years, many studies have attempted to open the black box of deep neural networks and have proposed various theories to explain them. Among these, Information Bottleneck (IB) theory claims that training consists of two distinct phases: a fitting phase followed by a compression phase. This claim has attracted considerable attention because of its success in explaining the inner behavior of feedforward neural networks. In this paper, we employ IB theory to understand the dynamic behavior of convolutional neural networks (CNNs) and investigate how fundamental architectural features such as convolutional layer width, kernel size, network depth, pooling layers, and multiple fully connected layers affect the performance of CNNs. In particular, through a series of experiments on the MNIST and Fashion-MNIST benchmarks, we demonstrate that the compression phase is not observed in all of these cases. This shows that CNNs exhibit more complicated behavior than feedforward neural networks.
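
The fitting/compression claim is usually checked on the "information plane" by tracking two mutual-information quantities per layer over training: I(X;T) between the input and the layer's activations T, and I(T;Y) between the activations and the labels. The abstract does not specify the authors' estimator, so the following is only a minimal sketch of the common binning-based approach; all variable names are illustrative assumptions.

```python
# Sketch (not the authors' code): binning-based information-plane estimates.
import numpy as np
from collections import Counter

def discretize(activations, n_bins=30):
    """Map each row of an (n_samples, n_units) activation matrix to a bin-index tuple."""
    edges = np.linspace(activations.min(), activations.max(), n_bins + 1)
    binned = np.digitize(activations, edges)
    return [tuple(row) for row in binned]

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    p_x, p_y, p_xy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in p_xy.items():
        mi += (c / n) * np.log2(c * n / (p_x[x] * p_y[y]))
    return mi

# Usage sketch (hypothetical names): `hidden` is one layer's activation matrix
# at one epoch, `x_ids` indexes the input samples, `labels` holds class labels.
# T = discretize(hidden)
# i_xt = mutual_information(x_ids, T)    # I(X;T): horizontal axis of the plane
# i_ty = mutual_information(T, labels)   # I(T;Y): vertical axis of the plane
```

A compression phase would appear as I(X;T) decreasing late in training while I(T;Y) stays high; the paper reports that this pattern is not observed across all the CNN configurations studied.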

Authors (2)
  1. Junjie Li (98 papers)
  2. Ding Liu (52 papers)
Citations (3)