
Depth-Width Trade-offs for Neural Networks via Topological Entropy (2010.07587v1)

Published 15 Oct 2020 in cs.LG

Abstract: One of the central problems in the study of deep learning theory is to understand how structural properties, such as depth, width, and the number of nodes, affect the expressivity of deep neural networks. In this work, we show a new connection between the expressivity of deep neural networks and topological entropy from dynamical systems theory, which can be used to characterize depth-width trade-offs of neural networks. We provide an upper bound on the topological entropy of neural networks with continuous semi-algebraic units in terms of their structural parameters. Specifically, the topological entropy of a ReLU network with $l$ layers and $m$ nodes per layer is upper bounded by $O(l\log m)$. Furthermore, if a neural network is a good approximation of some function $f$, then the size of the network is lower bounded exponentially in the topological entropy of $f$. Finally, we discuss the relationship between topological entropy, the number of oscillations, periods, and the Lipschitz constant.
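The trade-off in the abstract can be made concrete with a standard example. The tent map $T(x)=\min(2x,\,2-2x)$ on $[0,1]$ has topological entropy $\log 2$, its $l$-fold composition $T^l$ has entropy $l\log 2$, and $T^l$ crosses the level $1/2$ exactly $2^l$ times. Since $T$ is exactly representable by two ReLU units, a depth-$l$, constant-width ReLU network realizes a function whose oscillation count grows exponentially in depth, consistent with the $O(l\log m)$ entropy bound; conversely, approximating such a function well forces $l\log m = \Omega(h_{top}(f))$, i.e. an exponentially large width at fixed depth. The sketch below is not from the paper; it is a minimal numerical illustration of this entropy-oscillation link, and the function names (`tent`, `iterated_tent`, `count_crossings`) are hypothetical.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent(x):
    # Tent map T(x) = 2x on [0, 1/2], 2 - 2x on [1/2, 1];
    # realized exactly by two ReLU units: 2*relu(x) - 4*relu(x - 1/2).
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def iterated_tent(x, depth):
    # l-fold composition T^l: a depth-l, constant-width ReLU network.
    for _ in range(depth):
        x = tent(x)
    return x

def count_crossings(f, level=0.5, n=1_000_001):
    # Count sign changes of f(x) - level on a fine grid of [0, 1]:
    # a numerical proxy for the oscillation count of f.
    xs = np.linspace(0.0, 1.0, n)
    s = np.signbit(f(xs) - level)
    return int(np.count_nonzero(s[1:] != s[:-1]))

for l in range(1, 8):
    c = count_crossings(lambda x: iterated_tent(x, l))
    print(f"depth {l}: {c} crossings of 1/2 (expected {2**l})")
```

Counting level crossings on a fine grid is a standard proxy for the number of monotone pieces; at the resolution above, the printed counts should match $2^l$ for the small depths shown, illustrating how entropy (and hence required network size) scales with depth.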

Authors (3)
  1. Kaifeng Bu (46 papers)
  2. Yaobo Zhang (8 papers)
  3. Qingxian Luo (4 papers)
Citations (8)