Title |
---|
Grouping Synchronous to Eliminate Stragglers with Edge Computing in Distributed Deep Learning |
Abstract |
---|
With the development of artificial intelligence (AI) applications, large amounts of data are generated by mobile and IoT devices at the edge of the network, and deep learning tasks are executed to extract useful information from these user data. However, in this setting the edge nodes are heterogeneous and the network bandwidth is limited, which will cause general distributed deep learning to be inefficie... |
Year | DOI | Venue |
---|---|---|
2021 | 10.1109/ISPA-BDCloud-SocialCom-SustainCom52081.2021.00066 | 2021 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom) |
Keywords | DocType | ISSN
---|---|---
distributed deep training, gradient compression, parameter server, gradient sparsification | Conference | 2158-9178
ISBN | Citations | PageRank
---|---|---
978-1-6654-3574-1 | 0 | 0.34
References | Authors
---|---
0 | 9
Name | Order | Citations | PageRank |
---|---|---|---|
Zhiyi Gui | 1 | 0 | 0.34 |
Yang Xiang | 2 | 2930 | 212.67 |
Hao Yang | 3 | 0 | 0.68 |
Wei Li | 4 | 0 | 0.34 |
Lei Zhang | 5 | 0 | 0.34 |
Qi Qi | 6 | 210 | 56.01 |
J. Wang | 7 | 479 | 95.23 |
Haifeng Sun | 8 | 68 | 27.77 |
Jianxin Liao | 9 | 457 | 82.08 |