Title |
---|
Federated Learning Beyond the Star: Local D2D Model Consensus with Global Cluster Sampling |
Abstract |
---|
Federated learning has emerged as a popular technique for distributing model training across the network edge. Its learning architecture is conventionally a star topology between the devices and a central server. In this paper, we propose two-timescale hybrid federated learning (TT-HF), which migrates to a more distributed topology via device-to-device (D2D) communications. In TT-HF, local model training occurs at devices via successive gradient iterations, and the synchronization process occurs at two timescales: (i) macro-scale, where global aggregations are carried out via device-server interactions, and (ii) micro-scale, where local aggregations are carried out via D2D cooperative consensus formation in different device clusters. Our theoretical analysis reveals how device, cluster, and network-level parameters affect the convergence of TT-HF, and leads to a set of conditions under which a convergence rate of O(1/t) is guaranteed. Experimental results demonstrate the improvements in convergence and utilization that can be obtained by TT-HF over state-of-the-art federated learning baselines. |
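The two-timescale scheme described in the abstract can be sketched in code. The following is a minimal toy simulation, not the paper's algorithm: all constants, names, and the scalar quadratic losses are illustrative assumptions. Devices take local gradient steps interleaved with D2D consensus rounds inside each cluster (micro-scale), and the server periodically averages one sampled model per cluster (macro-scale).

```python
# Hypothetical sketch of TT-HF's two-timescale synchronization.
# All names and constants are illustrative assumptions, not from the paper.
import random

random.seed(0)
CLUSTERS, DEVICES = 3, 4
LOCAL_STEPS, CONSENSUS_ROUNDS, GLOBAL_ROUNDS, LR = 5, 3, 30, 0.1

# Each device minimizes a toy quadratic loss f_i(w) = 0.5*(w - c_i)^2,
# so the global optimum is the mean of all local optima c_i.
centers = [[random.gauss(0, 1) for _ in range(DEVICES)] for _ in range(CLUSTERS)]
optimum = sum(sum(row) for row in centers) / (CLUSTERS * DEVICES)

def consensus(models, alpha=0.5):
    """One D2D round: mix each device's model toward the cluster average
    (fully connected cluster graph with doubly stochastic weights)."""
    avg = sum(models) / len(models)
    return [(1 - alpha) * m + alpha * avg for m in models]

w = 0.0  # global model held at the server
for _ in range(GLOBAL_ROUNDS):
    sampled = []
    for k in range(CLUSTERS):
        models = [w] * DEVICES  # cluster starts from the global model
        for _ in range(LOCAL_STEPS):
            # micro-scale: one local gradient step per device ...
            models = [m - LR * (m - c) for m, c in zip(models, centers[k])]
            # ... followed by D2D consensus rounds within the cluster
            for _ in range(CONSENSUS_ROUNDS):
                models = consensus(models)
        # the server samples a single device's model from each cluster
        sampled.append(random.choice(models))
    # macro-scale: global aggregation across sampled cluster models
    w = sum(sampled) / len(sampled)

# Consensus drives intra-cluster disagreement toward zero, so the
# sampled models track their cluster means and w approaches the optimum.
print(abs(w - optimum) < 0.1)
```

Because each consensus round contracts intra-cluster disagreement, sampling a single device per cluster is nearly as informative as uploading every model, which is the intuition behind the cluster-sampling step.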
Year | DOI | Venue
---|---|---
2021 | 10.1109/GLOBECOM46510.2021.9685456 | 2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM)

DocType | ISSN | Citations
---|---|---
Conference | 2334-0983 | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 5
Name | Order | Citations | PageRank |
---|---|---|---
Frank Po-Chen Lin | 1 | 2 | 1.42 |
Seyyedali Hosseinalipour | 2 | 1 | 1.03 |
Sheikh Shams Azam | 3 | 1 | 1.05 |
Christopher G. Brinton | 4 | 118 | 15.23 |
Nicolò Michelusi | 5 | 427 | 35.43 |