Title
Asynchronous Stochastic Gradient Descent Over Decentralized Datasets
Abstract
The computational efficiency of asynchronous stochastic gradient descent (ASGD) over its synchronous counterpart has been well documented in recent works. Unfortunately, it usually works only when all workers retrieve data from a shared dataset. As data get larger and more distributed, new ideas are urgently needed to maintain the efficiency of ASGD for decentralized training....
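To make the setting concrete, the following is a minimal sketch of asynchronous SGD in which several workers, each holding its own local data shard, apply possibly stale gradients to a shared parameter vector without waiting for one another. This is a generic illustration on a toy least-squares problem, not the algorithm proposed in the paper; all names and parameters (worker, shards, lr, batch, etc.) are hypothetical choices for the example.

```python
import threading
import numpy as np

d, n_workers, shard_size = 5, 4, 200
master_rng = np.random.default_rng(0)
w_true = master_rng.normal(size=d)

# Each worker holds its own local data shard (decentralized datasets).
shards = []
for _ in range(n_workers):
    X = master_rng.normal(size=(shard_size, d))
    y = X @ w_true + 0.01 * master_rng.normal(size=shard_size)
    shards.append((X, y))

w = np.zeros(d)          # shared model parameters
lock = threading.Lock()  # protects updates to the shared parameters
lr, steps, batch = 0.05, 500, 8

def worker(shard, seed):
    """Run ASGD steps on a local shard, applying possibly stale (delayed) gradients."""
    global w
    rng = np.random.default_rng(seed)  # per-thread RNG; np.random.Generator is not thread-safe
    X, y = shard
    for _ in range(steps):
        idx = rng.integers(0, len(y), size=batch)
        w_stale = w.copy()  # read current parameters without waiting for other workers
        grad = X[idx].T @ (X[idx] @ w_stale - y[idx]) / batch
        with lock:          # apply the delayed gradient to the shared parameters
            w = w - lr * grad

threads = [threading.Thread(target=worker, args=(s, i)) for i, s in enumerate(shards)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("parameter estimation error:", np.linalg.norm(w - w_true))
```

Because each worker reads a snapshot of the parameters and updates them independently, the gradients it applies may be computed from stale values; tolerating this delay is what removes the synchronization barrier of synchronous SGD.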
Year
2021
DOI
10.1109/TCNS.2021.3059848
Venue
IEEE Transactions on Control of Network Systems
Keywords
Training, Convergence, Machine learning algorithms, Computational modeling, Stochastic processes, Computer architecture, Delays
DocType
Journal
Volume
8
Issue
3
ISSN
2325-5870
Citations
0
PageRank
0.34
References
0
Authors
2
Name	Order	Citations	PageRank
Yubo Du	1	0	0.68
Keyou You	2	831	50.16