Title
LightAdam: Towards a Fast and Accurate Adaptive Momentum Online Algorithm
Abstract
Adaptive optimization algorithms enjoy fast convergence and have been widely exploited in pattern recognition and cognitively-inspired machine learning. However, these algorithms can incur high computational cost and low generalization ability due to their projection steps. Such limitations make them difficult to apply in big data analytics, which may typically be seen in cognitively inspired learning, e.g. deep learning tasks. In this paper, we propose a fast and accurate adaptive momentum online algorithm, called LightAdam, to alleviate the drawbacks of projection steps for adaptive algorithms. The proposed algorithm substantially reduces the computational cost of each iteration by replacing high-order projection operators with one-dimensional linear searches. Moreover, we introduce a novel second-order momentum and employ dynamic learning rate bounds in the proposed algorithm, thereby obtaining higher generalization ability than other adaptive algorithms. We theoretically show that our proposed algorithm has a guaranteed convergence bound, and prove that it has better generalization capability than Adam. We conduct extensive experiments on three public datasets for image pattern classification, and validate the computational benefit and accuracy of the proposed algorithm in comparison with other state-of-the-art adaptive optimization algorithms.
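The key ideas described in the abstract, an Adam-style adaptive momentum update whose projection step is replaced by a one-dimensional linear search (a Frank-Wolfe-type linear minimization oracle) together with dynamically bounded step sizes, can be illustrated with a minimal sketch. The function names, the choice of an L1-ball feasible set, and the step-size schedule below are illustrative assumptions for exposition, not the paper's actual algorithm.

```python
import numpy as np

def lmo_l1_ball(g, radius=1.0):
    # Linear minimization oracle over the L1 ball:
    # argmin_{||s||_1 <= radius} <g, s> is a signed vertex at the
    # coordinate of largest |g|. This is a one-dimensional search,
    # so no high-order projection operator is needed.
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -radius * np.sign(g[i])
    return s

def adaptive_fw_step(x, g, m, v, t, beta1=0.9, beta2=0.999,
                     eps=1e-8, radius=1.0, lo=0.01, hi=0.5):
    # Adam-style first- and second-order momentum with bias correction.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    d = m_hat / (np.sqrt(v_hat) + eps)   # preconditioned direction
    s = lmo_l1_ball(d, radius)           # Frank-Wolfe vertex from the LMO
    # Dynamically bounded step size (clipped schedule, an assumption here).
    gamma = np.clip(2.0 / (t + 2.0), lo, hi)
    # Convex combination keeps the iterate feasible without projection.
    x = (1 - gamma) * x + gamma * s
    return x, m, v
```

For example, minimizing f(x) = ||x - b||^2 over the unit L1 ball only ever calls `lmo_l1_ball`, whereas a projected Adam step would need an explicit (and more expensive) projection onto the ball after every update.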
Year
2022
DOI
10.1007/s12559-021-09985-9
Venue
Cognitive Computation
Keywords
Adaptive training algorithm, Convex optimization, Online learning, Projection-free
DocType
Journal
Volume
14
Issue
2
ISSN
1866-9956
Citations
0
PageRank
0.34
References
10
Authors
5
Name           Order  Citations  PageRank
Yangfan Zhou   1      232        29.72
Kaizhu Huang   2      1010       83.94
Cheng Cheng    3      0          0.34
Xuguang Wang   4      0          0.34
Xin Liu        5      3919       320.56