Title
DEAM: Adaptive Momentum with Discriminative Weight for Stochastic Optimization
Abstract
Optimization algorithms with momentum, e.g., ADAM, help accelerate SGD in parameter updating and can reduce oscillations along the parameter update route. However, a fixed momentum weight (e.g., β1 in ADAM) will propagate errors in momentum computation. Moreover, such a hyperparameter can be extremely hard to tune in applications. In this paper, we introduce a novel optimization algorithm, namely Discriminative wEight on Adaptive Momentum (DEAM). DEAM computes the momentum weight automatically based on the discriminative angle, so that the momentum term is assigned an appropriate weight that configures its influence in the current step. In addition, DEAM contains a novel backtrack term, which restricts redundant updates when the previous step needs to be corrected. The backtrack term effectively adapts the learning rate and achieves anticipatory updates as well. Extensive experiments demonstrate that DEAM achieves a faster convergence rate than existing optimization algorithms when training various models. A full version of this paper can be accessed in [1].
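The abstract only summarizes the method, so the snippet below is a minimal NumPy sketch of the general idea rather than the authors' algorithm: an ADAM-style step in which the momentum weight (ADAM's β1) is derived from the angle between the accumulated momentum and the current gradient. The function name deam_like_step, the cosine-based mapping to beta1, and all hyperparameter defaults are assumptions for illustration, and the paper's backtrack term and exact discriminative-angle formula are omitted.

    import numpy as np

    def deam_like_step(theta, grad, state, lr=1e-3, beta2=0.999, eps=1e-8):
        # Unpack optimizer state: first moment m, second moment v, step count t.
        m, v = state["m"], state["v"]
        t = state["t"] + 1

        # Cosine of the angle between the accumulated momentum and the
        # current gradient: +1 when aligned, -1 when opposed.
        denom = np.linalg.norm(m) * np.linalg.norm(grad)
        cos_angle = float(m @ grad) / denom if denom > 0 else 0.0

        # Hypothetical mapping of that angle to a momentum weight in [0, 1]:
        # aligned directions keep more momentum, opposed directions keep less.
        # (The paper's actual discriminative-angle rule is not reproduced here.)
        beta1 = 0.5 * (1.0 + cos_angle)

        m = beta1 * m + (1.0 - beta1) * grad        # adaptive first moment
        v = beta2 * v + (1.0 - beta2) * grad ** 2   # second moment, as in ADAM
        v_hat = v / (1.0 - beta2 ** t)              # ADAM-style bias correction

        theta = theta - lr * m / (np.sqrt(v_hat) + eps)
        state.update(m=m, v=v, t=t)
        return theta, state

A training loop would initialize state = {"m": np.zeros_like(theta), "v": np.zeros_like(theta), "t": 0} and call deam_like_step once per minibatch gradient.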
Year
2020
DOI
10.1109/ASONAM49781.2020.9381367
Venue
2020 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM)
Keywords
DEAM, adaptive momentum, discriminative weight, stochastic optimization, ADAM, parameter updating, fixed momentum weight, momentum computing, optimization algorithm, discriminative angle, momentum term weight, redundant updates, backtrack term
DocType
Conference
ISSN
2473-9928
ISBN
978-1-7281-1057-8
Citations
0
PageRank
0.34
References
10
Authors
3
Name          Order  Citations  PageRank
Jiyang Bai    1      0          0.34
Yuxiang Ren   2      0          0.34
Jiawei Zhang  3      806        72.17