Abstract |
---|
Federated learning (FL) is an emerging technique for collaboratively training a machine-learning model using the data and computation resources of mobile devices without exposing private or sensitive user data. Appropriate incentive mechanisms that motivate data and mobile-device owners to participate in FL are key to building a sustainable platform. However, it is difficult to evaluate the contribution level of each participant, and hence to determine appropriate rewards, without large computation and communication overhead. This paper proposes a computation- and communication-efficient method of estimating participants' contribution levels. The proposed method requires only a single FL training process, which significantly reduces overhead. Performance evaluations on the MNIST dataset show that the proposed method estimates participant contributions accurately with 46-49% less computation overhead and no communication overhead compared to a naive estimation method. |
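To illustrate why the naive baseline mentioned in the abstract is expensive, the sketch below implements leave-one-out contribution estimation: each participant is excluded in turn and training is repeated, so N participants require N+1 full training runs. This is a toy illustration under stated assumptions (a trivial "model" that averages pooled client data), not the authors' proposed single-training method or their exact baseline.

```python
# Illustrative sketch (NOT the paper's method): leave-one-out contribution
# estimation, a naive baseline that needs one full training run per
# excluded participant. The toy "model" is simply the mean of pooled data.
import random

def train(datasets):
    """Toy stand-in for one FL training run: pool all client data and
    return the mean as the 'model'."""
    pooled = [x for d in datasets for x in d]
    return sum(pooled) / len(pooled)

def utility(model, target=1.0):
    """Toy evaluation metric: a model closer to the target scores higher."""
    return -abs(model - target)

def leave_one_out_contributions(datasets):
    """Contribution of client i = utility drop when client i is excluded.
    Requires len(datasets) + 1 training runs -- the computation overhead
    a single-training-process estimator avoids."""
    full = utility(train(datasets))
    contribs = []
    for i in range(len(datasets)):
        rest = datasets[:i] + datasets[i + 1:]
        contribs.append(full - utility(train(rest)))
    return contribs

random.seed(0)
# Three hypothetical clients; the third holds data far from the target,
# so excluding it should improve utility (negative contribution).
clients = [[random.gauss(mu, 0.1) for _ in range(20)] for mu in (1.0, 0.9, 0.2)]
print(leave_one_out_contributions(clients))
```

With 3 clients this already performs 4 training runs; for N mobile devices the cost grows linearly in N, which motivates the paper's single-training-process estimator.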
Year | DOI | Venue |
---|---|---|
2020 | 10.1109/GCWkshps50303.2020.9367484 | 2020 IEEE GLOBECOM WORKSHOPS (GC WKSHPS) |
Keywords | DocType | ISSN |
---|---|---|
Federated Learning, Incentive Mechanism, Contribution Estimation, Contribution Metric | Conference | 2166-0069 |
Citations | PageRank | References |
---|---|---|
0 | 0.34 | 0 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Takayuki Nishio | 1 | 106 | 38.21 |
Ryoichi Shinkuma | 2 | 136 | 34.16 |
Narayan B. Mandayam | 3 | 1471 | 161.08 |