On decision regions of narrow deep neural networks (1807.01194v4)
Published 3 Jul 2018 in cs.LG, cs.AI, cs.NE, and stat.ML
Abstract: We show that for neural network functions with width less than or equal to the input dimension, all connected components of the decision regions are unbounded. The result holds for continuous, strictly monotonic activation functions as well as for the ReLU activation function. This complements recent results on the approximation capabilities of such narrow neural networks by Hanin (2017) and on the connectivity of their decision regions by Nguyen et al. (2018). Our results are illustrated by means of numerical experiments.
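The claim lends itself to a quick numerical sanity check. The sketch below is a hypothetical illustration, not the paper's own experiments: it builds a random ReLU network on R^2 whose hidden widths do not exceed the input dimension, grid-evaluates the positive decision region on growing boxes, and uses `scipy.ndimage.label` to verify that every connected component reaches the edge of the box rather than closing off inside it, consistent with (though of course not a proof of) unboundedness. The network architecture, seed, and function names are assumptions of this sketch.

```python
# Minimal sketch (assumed setup, not the authors' code): for a ReLU net whose
# hidden widths do not exceed the input dimension, every connected component
# of a decision region should appear unbounded on any finite evaluation box.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(0)

def narrow_relu_net(x, layers):
    """Forward pass of a fully connected ReLU net; x has shape (n, 2)."""
    h = x
    for W, b in layers[:-1]:
        h = np.maximum(h @ W + b, 0.0)   # ReLU hidden layers, width <= 2
    W, b = layers[-1]
    return (h @ W + b).ravel()           # scalar logit; class = sign(logit)

# Random narrow network 2 -> 2 -> 2 -> 1 (hidden widths <= input dimension).
layers = [(rng.standard_normal((2, 2)), rng.standard_normal(2)),
          (rng.standard_normal((2, 2)), rng.standard_normal(2)),
          (rng.standard_normal((2, 1)), rng.standard_normal(1))]

for box in (10.0, 100.0, 1000.0):        # grow the evaluation box
    g = np.linspace(-box, box, 401)
    X, Y = np.meshgrid(g, g)
    logits = narrow_relu_net(np.column_stack([X.ravel(), Y.ravel()]), layers)
    region = (logits > 0).reshape(X.shape)   # positive decision region
    components, n = label(region)            # connected components on the grid
    # A bounded component would sit strictly inside the box; here every
    # component should touch the border, consistent with unboundedness.
    border = np.zeros_like(region, dtype=bool)
    border[0, :] = border[-1, :] = border[:, 0] = border[:, -1] = True
    touching = {c for c in components[border].ravel() if c != 0}
    print(f"box +/-{box}: {n} components, {len(touching)} touch the border")
```

A finite grid cannot certify unboundedness, so the check is only corroborating: if the theorem held and a component were bounded, it would eventually appear strictly inside one of the growing boxes, which is what the border test rules out here.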