Title
Porcupine Neural Networks: (Almost) All Local Optima are Global
Abstract
Neural networks are used prominently in many machine learning and statistics applications. In general, the underlying optimization of neural networks is non-convex, which makes their performance analysis challenging. In this paper, we take a novel approach to this problem by asking whether one can constrain neural network weights so that the optimization landscape has good theoretical properties while, at the same time, remaining a good approximation of the unconstrained one. For two-layer neural networks, we provide affirmative answers to these questions by introducing Porcupine Neural Networks (PNNs), whose weight vectors are constrained to lie over a finite set of lines. We show that most local optima of the PNN optimization are global optima, and we characterize the regions where bad local optima may exist. Moreover, our theoretical and empirical results suggest that an unconstrained neural network can be approximated using a polynomially large PNN.
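A minimal sketch of the PNN constraint described in the abstract: each hidden unit's weight vector is restricted to a fixed line, i.e., w_i = a_i * u_i for a fixed unit direction u_i and a trainable scalar a_i. The dimensions, variable names, and the ReLU-plus-sum readout below are illustrative assumptions, not the paper's exact construction.

import numpy as np

rng = np.random.default_rng(0)
d, k = 10, 4  # hypothetical input dimension and number of hidden units

# Fixed unit directions: the finite set of lines the weights must lie on.
U = rng.normal(size=(k, d))
U /= np.linalg.norm(U, axis=1, keepdims=True)

# Trainable scalars: the only free parameters, one per hidden unit.
a = rng.normal(size=k)

def pnn_forward(x):
    # Each weight vector is constrained to its line: w_i = a_i * u_i.
    W = a[:, None] * U
    # Two-layer ReLU network with unit output weights (an assumed readout).
    return np.maximum(W @ x, 0.0).sum()

x = rng.normal(size=d)
print(pnn_forward(x))

Under this constraint, training optimizes only the scalars a, so the search space is the finite union of lines rather than all of R^d per unit, which is the structure the paper's landscape analysis exploits.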
Year
2017
Venue
arXiv: Machine Learning
Field
Finite set, Local optimum, Stochastic neural network, Types of artificial neural networks, Artificial intelligence, Deep learning, Artificial neural network, Mathematics, Machine learning
DocType
Journal
Volume
abs/1710.02196
Citations
5
PageRank
0.47
References
20
Authors
4
Name             Order  Citations  PageRank
Soheil Feizi     1      113        24.65
Hamid Javadi     2      7          1.84
Jesse Zhang      3      10         4.58
David N. C. Tse  4      2078       246.17