Title
A two-stage subspace trust region approach for deep neural network training
Abstract
In this paper, we develop a novel second-order method for training feed-forward neural networks. At each iteration, we construct a quadratic approximation to the cost function in a low-dimensional subspace. We minimize this approximation inside a trust region through a two-stage procedure: first within the embedded positive-curvature subspace, then with a gradient descent step. This approach leads to fast decay of the objective function, prevents convergence to saddle points, and alleviates the need for manual parameter tuning. We demonstrate the good performance of the proposed algorithm on benchmark datasets.
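The abstract only describes the method at a high level. The sketch below is not the authors' algorithm but a minimal NumPy illustration of what one such two-stage subspace trust-region step could look like. It assumes the quadratic model m(s) = gᵀs + ½ sᵀHs is accessed through a Hessian-vector product, that the subspace basis V has orthonormal columns, and that every name here (subspace_tr_step, hess_vec, delta, the 0.1 step size) is hypothetical.

```python
import numpy as np

def subspace_tr_step(grad, hess_vec, V, delta):
    """Illustrative two-stage step: minimize a quadratic model over the
    positive-curvature part of a low-dimensional subspace inside a trust
    region of radius delta, then apply a gradient-descent correction.
    Assumes V (n x k) has orthonormal columns."""
    # Project the problem onto the subspace: g_s = V^T g, H_s = V^T H V.
    g_s = V.T @ grad
    H_s = V.T @ np.column_stack([hess_vec(V[:, j]) for j in range(V.shape[1])])
    H_s = 0.5 * (H_s + H_s.T)  # symmetrize against round-off

    # Stage 1: Newton-like step restricted to positive-curvature eigen-directions.
    eigval, eigvec = np.linalg.eigh(H_s)
    pos = eigval > 1e-12
    if np.any(pos):
        U, lam = eigvec[:, pos], eigval[pos]
        alpha = -(U.T @ g_s) / lam          # minimizer coefficients in the eigenbasis
        norm = np.linalg.norm(alpha)
        if norm > delta:                     # truncate to the trust region ||s|| <= delta
            alpha *= delta / norm
        s_sub = U @ alpha
    else:
        s_sub = np.zeros_like(g_s)
    step = V @ s_sub

    # Stage 2: plain gradient step on the quadratic model at the new point,
    # helping to move along negative/near-zero curvature directions and
    # escape saddle points (the step size 0.1 is purely illustrative).
    residual = grad + hess_vec(step)
    step -= 0.1 * residual
    return step
```

On a toy problem one could call this with, e.g., `hess_vec = lambda v: H @ v` for an explicit Hessian H and a basis V built from the current gradient and a few previous update directions; the actual model construction and step-size rules of the paper may differ.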
Year
2018
DOI
10.23919/EUSIPCO.2017.8081215
Venue
2017 25th European Signal Processing Conference (EUSIPCO)
Keywords
Deep learning, second-order approach, nonconvex optimization, trust region, subspace method
DocType
Journal
Volume
abs/1805.09430
ISSN
2076-1465
ISBN
978-1-5386-0751-0
Citations
0
PageRank
0.34
References
14
Authors
5
Name | Order | Citations | PageRank
Viacheslav Dudar | 1 | 0 | 0.34
Giovanni Chierchia | 2 | 176 | 14.74
Emilie Chouzenoux | 3 | 202 | 26.37
Jean-Christophe Pesquet | 4 | 18 | 11.52
Vladimir Semenov | 5 | 0 | 0.34