Title |
---|
Minimizing Training Time of Distributed Machine Learning by Reducing Data Communication |

Abstract |
---|
Due to the additive property of most machine learning objective functions, training can be distributed across multiple machines. Distributed machine learning is an efficient way to handle the rapid growth of data volume, at the cost of extra inter-machine communication. One common implementation is the parameter server system, which contains two types of nodes: worker nodes, which are used for ca... |
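The abstract's key observation is that because the overall objective is a sum of per-shard losses, workers can compute gradients on their local data and a central server can aggregate them, so only gradient vectors cross the network. Below is a minimal sketch of that parameter-server pattern in plain NumPy; it is not the paper's system, and names such as `ParameterServer` and `worker_gradient` are illustrative assumptions.

```python
# Minimal parameter-server sketch (illustrative, not the authors' system):
# workers hold data shards and compute local gradients; the server
# aggregates them, exploiting the additive structure of the objective.
import numpy as np

def worker_gradient(w, X, y):
    """Gradient of the local least-squares loss on one worker's shard.
    Because the global objective is a sum over shards, the global
    gradient is recovered by combining these local gradients."""
    return X.T @ (X @ w - y) / len(y)

class ParameterServer:
    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr

    def update(self, grads):
        # Average the worker gradients; with equal-size shards this
        # equals the gradient of the full-data average loss.
        self.w -= self.lr * np.mean(grads, axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=400)

# Partition the data across 4 workers; only gradients (size = model
# dimension), not the raw data, cross the worker/server boundary.
shards = list(zip(np.array_split(X, 4), np.array_split(y, 4)))
server = ParameterServer(dim=5)
for step in range(200):
    grads = [worker_gradient(server.w, Xi, yi) for Xi, yi in shards]
    server.update(grads)

print("recovered w close to truth:", np.allclose(server.w, w_true, atol=0.05))
```

In a real deployment the `grads` list would arrive over the network each step, and its total size (number of workers times model dimension, per iteration) is the inter-machine communication cost that the paper's title targets for reduction.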
Year | DOI | Venue |
---|---|---|
2021 | 10.1109/TNSE.2021.3073897 | IEEE Transactions on Network Science and Engineering |
Keywords | DocType | Volume
---|---|---|
Servers, Training, Machine learning, Machine learning algorithms, Data models, Resource management, Distributed databases | Journal | 8

Issue | ISSN | Citations
---|---|---|
2 | 2327-4697 | 1

PageRank | References | Authors
---|---|---|
0.34 | 0 | 3

Name | Order | Citations | PageRank |
---|---|---|---|
Yubin Duan | 1 | 5 | 4.47 |
Ning Wang | 2 | 21 | 7.96 |
Jie Wu | 3 | 8307 | 592.07 |