Title
Asynchronous Stochastic Gradient Descent over Decentralized Datasets
Abstract
Asynchronous stochastic gradient descent (ASGD) usually operates in the centralized setting, where workers retrieve data from a shared training set. This paper focuses on decentralized scenarios in which each worker accesses only a subset of the whole training set. We find that, due to the heterogeneity of the decentralized setting, ASGD optimizes in the wrong direction and thus obtains poor solutions. To tackle this issue, we propose a novel algorithm, DASGD, for the above setting. Our key idea is to form an asymptotically unbiased and accurate gradient estimate by reweighting stochastic gradients via an importance-sampling technique. Numerical results substantiate the performance of the proposed algorithm in the decentralized setting.
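The importance-sampling reweighting idea named in the abstract can be illustrated with a short derivation; the notation below (global distribution p, local distribution p_i, sample z) is ours for illustration, not taken from the paper. If worker i draws samples from its local distribution p_i while the global objective averages over p, weighting each stochastic gradient by the likelihood ratio p(z)/p_i(z) yields an unbiased estimate of the global gradient:

% A minimal sketch of importance-sampling reweighting (assumed notation, not from the paper).
% Global objective: F(x) = E_{z ~ p}[ f(x; z) ]; worker i only samples z ~ p_i.
\[
  \mathbb{E}_{z \sim p_i}\!\left[\frac{p(z)}{p_i(z)}\,\nabla f(x;z)\right]
  = \int \frac{p(z)}{p_i(z)}\,\nabla f(x;z)\,p_i(z)\,\mathrm{d}z
  = \int \nabla f(x;z)\,p(z)\,\mathrm{d}z
  = \nabla F(x).
\]
% Hence the reweighted stochastic gradient points in the global descent direction
% even though worker i never sees samples outside its own subset.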
Year
2020
DOI
10.1109/ICCA51439.2020.9264316
Venue
2020 IEEE 16th International Conference on Control & Automation (ICCA)
Keywords
decentralized scenarios, decentralized setting, ASGD, asynchronous stochastic gradient descent, decentralized datasets, centralized setting, asymptotically unbiased accurate gradient estimation, DASGD, importance sampling, data retrieval
DocType
Conference
ISSN
1948-3449
ISBN
978-1-7281-9094-5
Citations
0
PageRank
0.34
References
8
Authors
3
Name       Order  Citations  PageRank
Yubo Du    1      0          0.68
Keyou You  2      831        50.16
Yilin Mo   3      891        51.51