Title
FedSim: Similarity guided model aggregation for Federated Learning
Abstract
Federated Learning (FL) is a distributed machine learning approach in which clients contribute to learning a global model in a privacy-preserving manner. Effective aggregation of client models is essential to create a generalised global model. The extent to which a client generalises and contributes to this aggregation can be ascertained by analysing inter-client relationships. We use similarity between clients to model such relationships. We explore how similarity knowledge can be inferred by comparing client gradients, rather than client data, which would violate the privacy-preserving constraint in FL. The similarity-guided FedSim algorithm, introduced in this paper, decomposes FL aggregation into local and global steps. Clients with similar gradients are clustered to produce local aggregations, which are then aggregated globally to ensure better coverage whilst reducing variance. Our comparative study investigates the applicability of FedSim both on real-world datasets and on synthetic datasets where statistical heterogeneity can be controlled and studied systematically. Comparison with state-of-the-art FL baselines, FedAvg and FedProx, shows significant performance gains. Our findings confirm that by exploiting latent inter-client similarities, FedSim's performance is significantly better and more stable than both these baselines.
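The two-step scheme the abstract describes (cluster clients by gradient similarity, aggregate within each cluster, then aggregate the cluster models globally) might be sketched as below. This is a minimal illustration only: the greedy cosine-threshold clustering rule, the `threshold` parameter, and all function names are assumptions for demonstration, not the paper's actual FedSim algorithm.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two flattened gradient vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def cluster_by_gradient_similarity(grads, threshold=0.5):
    """Greedily group clients whose gradients have cosine similarity
    above `threshold` with a cluster's first member (illustrative rule)."""
    clusters = []
    for i, g in enumerate(grads):
        for c in clusters:
            if cosine_sim(g, grads[c[0]]) >= threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

def fedsim_aggregate(updates, grads, threshold=0.5):
    """Two-step aggregation: average client updates within each
    similarity cluster (local step), then average the cluster
    models to form the global model (global step)."""
    clusters = cluster_by_gradient_similarity(grads, threshold)
    local_models = [np.mean([updates[i] for i in c], axis=0) for c in clusters]
    return np.mean(local_models, axis=0)
```

For example, with two pairs of clients whose gradients point in opposite directions, the sketch forms two clusters and the global model is the mean of the two cluster averages.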
Year
2022
DOI
10.1016/j.neucom.2021.08.141
Venue
Neurocomputing
Keywords
Federated Learning, Model aggregation, Similarity, Clustering
DocType
Journal
Volume
483
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
4
Name                 | Order | Citations | PageRank
Chamath Palihawadana | 1     | 0         | 0.34
Nirmalie Wiratunga   | 2     | 0         | 0.34
Anjana Wijekoon      | 3     | 0         | 5.07
Harsha Kalutarage    | 4     | 0         | 0.34