Title
On decision regions of narrow deep neural networks
Abstract
We show that for neural network functions that have width less than or equal to the input dimension, all connected components of decision regions are unbounded. The result holds for continuous and strictly monotonic activation functions as well as for the ReLU activation function. This complements recent results on approximation capabilities by Hanin and Sellke (2017) and connectivity of decision regions by Nguyen et al. (2018) for such narrow neural networks. Our results are illustrated by means of numerical experiments.
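The stated result can be probed numerically in the spirit of the paper's experiments. The following is a minimal illustrative sketch, not the authors' code: it builds a random ReLU network whose hidden width equals the input dimension (2), rasterizes the induced decision regions on a finite window, and checks whether any connected component stays strictly inside the window, which would be a candidate for a bounded component. The architecture, depth, random seed, window size, and the use of scipy.ndimage.label are assumptions made for illustration only.

# Illustrative sketch only (assumed setup, not the authors' experiments):
# a random ReLU network with hidden width equal to the input dimension,
# rasterized on a finite window; we then look for decision-region components
# that do not reach the window boundary (candidates for bounded components).
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)           # assumed seed, for reproducibility

def narrow_relu_net(X, weights):
    """Forward pass; X has shape (d, N), one input point per column."""
    h = X
    for W, b in weights[:-1]:
        h = np.maximum(W @ h + b[:, None], 0.0)   # hidden width = input dimension
    W, b = weights[-1]
    return (W @ h + b[:, None]).ravel()           # scalar output per point

d, depth = 2, 4                          # input dimension and number of hidden layers (assumed)
weights = [(rng.standard_normal((d, d)), rng.standard_normal(d)) for _ in range(depth)]
weights.append((rng.standard_normal((1, d)), rng.standard_normal(1)))

# Rasterize the two decision regions {f > 0} and {f <= 0} on [-R, R]^2.
R, n = 10.0, 401
xs = np.linspace(-R, R, n)
X, Y = np.meshgrid(xs, xs)
pts = np.stack([X.ravel(), Y.ravel()])
cls = (narrow_relu_net(pts, weights) > 0).reshape(n, n)

# A connected component that never touches the window border is a candidate
# for a bounded component; the theorem predicts none should appear.
bounded_candidate = False
for value in (True, False):
    labels, num = ndimage.label(cls == value)
    for k in range(1, num + 1):
        mask = labels == k
        touches_border = mask[0].any() or mask[-1].any() or mask[:, 0].any() or mask[:, -1].any()
        bounded_candidate = bounded_candidate or not touches_border
print("bounded-looking component found:", bounded_candidate)

Running the sketch for several random seeds and window sizes should consistently report no interior-only component, consistent with the unboundedness claim; a finite window can of course only provide evidence, not a proof.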
Year
2018
DOI
10.1016/j.neunet.2021.02.024
Venue
Neural Networks
Keywords
Expressive power, Approximation by network functions, Neural networks, Decision regions, Width of neural networks
Field
Monotonic function, Mathematical optimization, Artificial intelligence, Connected component, Artificial neural network, Deep neural networks, Mathematics
DocType
Journal
Volume
140
Issue
1
ISSN
0893-6080
Citations
2
PageRank
0.38
References
1
Authors
3
Name                 Order  Citations  PageRank
Hans-Peter Beise     1      2          0.38
Steve Dias Da Cruz   2      2          0.38
Udo Schröder         3      2          0.38