Title
PID Controller-Based Stochastic Optimization Acceleration for Deep Neural Networks
Abstract
Deep neural networks (DNNs) are widely used and have demonstrated their power in many applications, such as computer vision and pattern recognition. However, training these networks can be time-consuming, a problem that can be alleviated by using efficient optimizers. As one of the most commonly used optimizers, stochastic gradient descent with momentum (SGD-M) uses both past and present gradients for parameter updates. However, during network training, SGD-M may suffer from drawbacks such as the overshoot phenomenon, which slows convergence. To alleviate this problem and accelerate the convergence of DNN optimization, we propose a proportional-integral-derivative (PID) approach. Specifically, we first investigate the intrinsic relationship between a PID controller and SGD-M. We then propose a PID-based optimization algorithm that exploits the past, present, and change of gradients to update the network parameters. Consequently, our PID-based optimizer alleviates the overshoot problem suffered by SGD-M. When tested on popular DNN architectures, it also obtains up to 50% acceleration with competitive accuracy. Extensive experiments on computer vision and natural language processing benchmarks, including CIFAR10, CIFAR100, Tiny-ImageNet, and PTB, demonstrate the effectiveness of our method. We have released the code at https://github.com/tensorboy/PIDOptimizer.
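The abstract describes an update rule that combines past gradients (integral), the present gradient (proportional), and the change between consecutive gradients (derivative). A minimal sketch of such a generic PID-style step is below; the coefficients `kP`, `kI`, `kD`, the decay on the integral term, and the helper name `pid_update` are illustrative assumptions, not the authors' exact formulation or tuned values:

```python
import numpy as np

def pid_update(theta, grad, state, lr=0.01, kP=1.0, kI=0.9, kD=0.1):
    """One PID-style parameter update (illustrative, not the paper's exact rule).

    P term: the present gradient.
    I term: a decayed running sum of past gradients (momentum-like).
    D term: the change between consecutive gradients.
    """
    integral = kI * state.get("integral", np.zeros_like(theta)) + grad
    derivative = grad - state.get("prev_grad", np.zeros_like(theta))
    state["integral"] = integral
    state["prev_grad"] = grad.copy()
    step = kP * grad + integral + kD * derivative
    return theta - lr * step

# Usage: minimize f(x) = x^2, whose gradient is 2x.
theta = np.array([5.0])
state = {}
for _ in range(200):
    theta = pid_update(theta, 2.0 * theta, state)
```

With only the P and I terms this reduces to a momentum-like method; the D term reacts to gradient changes, which is the mechanism the paper credits for damping overshoot.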
Year: 2020
DOI: 10.1109/TNNLS.2019.2963066
Venue: IEEE Transactions on Neural Networks and Learning Systems
Keywords: Algorithms; Databases, Factual; Deep Learning; Natural Language Processing; Neural Networks, Computer; Stochastic Processes; Visual Prosthesis
DocType: Journal
Volume: 31
Issue: 12
ISSN: 2162-237X
Citations: 0
PageRank: 0.34
References: 11
Authors: 6
Name         Order  Citations  PageRank
Wang H       1      712        9.35
Yi Luo       2      0          0.34
An Wangpeng  3      12         2.52
Qingyun Sun  4      6          2.49
Jun Xu       5      156        9.95
Lei Zhang    6      163265     43.99