Abstract
---

To exploit the vast amount of data distributed across edge devices, Federated Learning (FL) has been proposed to learn a shared model by performing training locally on participating devices and aggregating the local models into a global one. Existing FL algorithms suffer accuracy loss because the data samples across devices are usually not independent and identically distributed (non-i.i.d.). In addition, devices may lose connection during training in wireless edge computing. We therefore build on the one-shot Neural Architecture Search (NAS) technique to propose a solution that handles the non-i.i.d. problem and is robust to intermittent connectivity. We adopt a large network as the global model, which contains all candidate network architectures. The non-i.i.d. problem is alleviated in two steps: (1) identify and train the candidate networks that are potentially high-performing and less biased, using a heuristic sampling scheme; (2) search the candidate networks for the final model with the highest accuracy. Experimental results show that the model trained by the proposed method is robust to the non-i.i.d. problem and reduces communication overhead by 84% compared with the baselines.
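The two steps above follow the usual one-shot NAS pattern: sample candidate sub-networks (paths) from a shared supernet, then pick the best one. A minimal, framework-free sketch of that pattern, assuming a toy supernet and a stand-in accuracy function (the paper's actual supernet, heuristic sampling scheme, and FL training loop are not described in the abstract, so all names here are illustrative):

```python
import random

# Toy supernet: each layer offers several candidate operations.
# A "candidate network" in the abstract's sense is one op chosen per layer.
SUPERNET = [
    ["conv3x3", "conv5x5", "skip"],      # layer 0 choices
    ["conv3x3", "maxpool", "skip"],      # layer 1 choices
    ["conv3x3", "conv5x5", "maxpool"],   # layer 2 choices
]

def sample_path(rng=random):
    """Step (1), simplified: sample one candidate sub-network
    (uniformly here; the paper uses a heuristic sampling scheme)."""
    return [rng.choice(ops) for ops in SUPERNET]

def search_best(candidates, accuracy_fn):
    """Step (2): select the candidate with the highest accuracy."""
    return max(candidates, key=accuracy_fn)

def toy_accuracy(path):
    # Stand-in for validation accuracy, only so the example runs end to end.
    return sum(len(op) for op in path)

candidates = [sample_path() for _ in range(8)]
best = search_best(candidates, toy_accuracy)
print(best)
```

In a real FL setting, each sampled path would be trained on clients' local data with shared supernet weights, so the final search needs no extra training rounds.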
Year | DOI | Venue
---|---|---
2021 | 10.1109/GLOBECOM46510.2021.9685909 | 2021 IEEE Global Communications Conference (GLOBECOM)

DocType | ISSN | Citations
---|---|---
Conference | 2334-0983 | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 6
Name | Order | Citations | PageRank |
---|---|---|---
Feifei Zhang | 1 | 0 | 0.34 |
JiDong Ge | 2 | 119 | 28.39 |
Chifong Wong | 3 | 6 | 1.46 |
Sheng Zhang | 4 | 44 | 15.62 |
Chuanyi Li | 5 | 0 | 0.34 |
Bin Luo | 6 | 66 | 21.04 |