Title
AdaCN: An Adaptive Cubic Newton Method for Nonconvex Stochastic Optimization
Abstract
In this work, we introduce AdaCN, a novel adaptive cubic Newton method for nonconvex stochastic optimization. AdaCN dynamically captures the curvature of the loss landscape via a diagonally approximated Hessian plus the norm of the difference between the previous two estimates. It requires only first-order gradients and updates with linear complexity in both time and memory. To reduce the variance introduced by the stochastic nature of the problem, AdaCN employs the first and second moments to implement exponential moving averages of the iteratively updated stochastic gradients and the approximated stochastic Hessians, respectively. We validate AdaCN in extensive experiments, showing that it outperforms stochastic first-order methods (including SGD, Adam, and AdaBound) and a stochastic quasi-Newton method (i.e., Apollo) in terms of both convergence speed and generalization performance.
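The abstract only sketches the update rule, so the following minimal NumPy sketch shows one way an AdaCN-style step could be organized. The secant-style diagonal curvature estimate, the Adam-style bias corrections, the function name adacn_step, and all hyperparameter values are illustrative assumptions, not the authors' published algorithm.

import numpy as np

def adacn_step(w, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-4):
    # One AdaCN-style update (illustrative sketch, not the authors' code).
    # `state` carries the step count, the two moments, and the previous
    # gradient/parameters needed for the diagonal curvature estimate.
    state["t"] = state.get("t", 0) + 1
    t = state["t"]

    # First moment: exponential moving average of stochastic gradients.
    state["m"] = beta1 * state.get("m", np.zeros_like(w)) + (1 - beta1) * grad

    # Diagonal Hessian estimate from first-order information only; a
    # secant-style ratio |g_t - g_{t-1}| / |w_t - w_{t-1}| is assumed here,
    # since the abstract does not specify the approximation.
    if "prev_grad" in state:
        B = np.abs(grad - state["prev_grad"]) / (np.abs(w - state["prev_w"]) + eps)
    else:
        B = np.zeros_like(w)

    # Second moment: exponential moving average of the curvature estimates.
    D_prev = state.get("D", np.zeros_like(w))
    state["D"] = beta2 * D_prev + (1 - beta2) * B

    # Cubic-regularization term: norm of the difference between the
    # previous two curvature estimates, as described in the abstract.
    rho = np.linalg.norm(state["D"] - D_prev)

    # Adam-style bias corrections (assumed, not stated in the abstract).
    m_hat = state["m"] / (1 - beta1 ** t)
    D_hat = state["D"] / (1 - beta2 ** t)

    state["prev_w"], state["prev_grad"] = w.copy(), grad.copy()

    # Preconditioned step; every quantity is elementwise except rho,
    # so time and memory stay linear in the number of parameters.
    return w - lr * m_hat / (D_hat + rho + eps)

In a training loop this would be called as w = adacn_step(w, grad_fn(w), state) with state initialized to an empty dict; all per-parameter buffers have the same shape as w, consistent with the linear time and memory claim.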
Year
2021
DOI
10.1155/2021/5790608
Venue
COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE
DocType
Journal
Volume
2021
ISSN
1687-5265
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order  Citations  PageRank
Yan Liu         1      21         3.88
Maojun Zhang    2      314        48.74
Zhiwei Zhong    3      0          0.34
Xiangrong Zeng  4      0          0.34