Title
Complexity of Training ReLU Neural Network.
Abstract
In this paper, we explore some basic questions on the complexity of training neural networks with the ReLU activation function. We show that it is NP-hard to train a two-hidden-layer feedforward ReLU neural network. If the dimension d of the data is fixed, we show that there exists a polynomial-time algorithm for the same training problem. We also show that if sufficient over-parameterization is provided in the first hidden layer of the ReLU neural network, then there is a polynomial-time algorithm that finds weights such that the output of the over-parameterized ReLU neural network matches the output on the given data.
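As an illustrative sketch only (this is not the paper's training procedure), the two-hidden-layer feedforward ReLU architecture studied here can be written as a short numpy forward pass; the input dimension d and the hidden-layer widths chosen below are assumptions made for the example.

import numpy as np

def relu(z):
    # componentwise ReLU activation: max(z, 0)
    return np.maximum(z, 0.0)

def two_hidden_layer_relu(x, W1, b1, W2, b2, w3, b3):
    # first hidden layer of ReLU units
    h1 = relu(W1 @ x + b1)
    # second hidden layer of ReLU units
    h2 = relu(W2 @ h1 + b2)
    # single linear output node
    return w3 @ h2 + b3

# Example with assumed sizes: d = 3 inputs, hidden widths 4 and 2.
rng = np.random.default_rng(0)
d, n1, n2 = 3, 4, 2
x = rng.standard_normal(d)
W1, b1 = rng.standard_normal((n1, d)), rng.standard_normal(n1)
W2, b2 = rng.standard_normal((n2, n1)), rng.standard_normal(n2)
w3, b3 = rng.standard_normal(n2), 0.0
print(two_hidden_layer_relu(x, W1, b1, W2, b2, w3, b3))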
Year
2018
DOI
10.1016/J.DISOPT.2020.100620
Venue
arXiv: Computational Complexity
Field
Discrete mathematics, Existential quantification, Activation function, Artificial intelligence, Time complexity, Artificial neural network, Mathematics, Feed forward
DocType
Journal
Volume
abs/1809.10787
Citations
0
PageRank
0.34
References
0
Authors
3
Name             Order   Citations   PageRank
Digvijay Boob    1       1           1.70
Santanu S. Dey   2       0           0.34
Guanghui Lan     3       1212        66.26