Title
Analysis on the Number of Linear Regions of Piecewise Linear Neural Networks
Abstract
Deep neural networks (DNNs) have proven to be excellent solutions to challenging and sophisticated problems in machine learning. A key reason for their success is the strong expressive power of their function representations. For piecewise linear neural networks (PLNNs), the number of linear regions is a natural measure of expressive power, since it characterizes the number of linear pieces available to model complex patterns. In this article, we theoretically analyze the expressive power of PLNNs by counting and bounding their number of linear regions. We first refine the existing upper and lower bounds on the number of linear regions of PLNNs with rectified linear units (ReLU PLNNs). Next, we extend the analysis to PLNNs with general piecewise linear (PWL) activation functions and derive the exact maximum number of linear regions of single-layer PLNNs. Moreover, we obtain upper and lower bounds on the number of linear regions of multilayer PLNNs, both of which scale polynomially with the number of neurons per layer and the number of pieces of the PWL activation function, but exponentially with the number of layers. This key property enables deep PLNNs with complex activation functions to outperform their shallow counterparts when computing highly complex and structured functions, which, to some extent, explains the performance improvement of deep PLNNs in classification and function fitting.
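The notion of a linear region can be made concrete with a small experiment. The sketch below (illustrative only, not taken from the paper) empirically counts the linear regions of a one-hidden-layer ReLU network on a scalar input by scanning a dense grid and detecting where the pattern of active neurons changes; each distinct activation pattern corresponds to one linear piece. All weights and the grid range here are arbitrary assumptions.

```python
# Sketch: count linear regions of a 1-D, one-hidden-layer ReLU network.
# With n hidden ReLU neurons and a scalar input, each neuron contributes
# at most one breakpoint (at x = -b/w), so there are at most n + 1 regions.
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # number of hidden ReLU neurons
w = rng.standard_normal(n)              # input weights (assumed random)
b = rng.standard_normal(n)              # biases (assumed random)

xs = np.linspace(-10.0, 10.0, 200001)   # dense 1-D grid over the input
# Activation pattern at each x: which neurons satisfy w*x + b > 0.
patterns = (np.outer(xs, w) + b) > 0    # boolean array, shape (len(xs), n)
# A new linear region starts wherever the activation pattern changes.
changes = np.any(patterns[1:] != patterns[:-1], axis=1)
num_regions = 1 + int(changes.sum())

print(num_regions)                      # at most n + 1 = 9 on this grid
```

For multilayer networks the same pattern-counting idea applies, but the number of distinct patterns, and hence regions, can grow exponentially with depth, which is the behavior the bounds in the abstract quantify.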
Year: 2022
DOI: 10.1109/TNNLS.2020.3028431
Venue: IEEE Transactions on Neural Networks and Learning Systems
Keywords: Deep learning, expressive power, linear region, piecewise linear neural network (PLNN)
DocType: Journal
Volume: 33
Issue: 2
ISSN: 2162-237X
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name           Order  Citations  PageRank
Qiang Hu       1      3          1.73
Hao Zhang      2      5          2.79
Feifei Gao     3      3093       212.03
Chengwen Xing  4      891        73.77
An Jian-ping   5      135        28.23