Title
Online Federated Multitask Learning
Abstract
With the widespread use of mobile devices, it becomes increasingly important to analyze distributed data collected from multiple devices. Federated learning is a distributed learning framework that leverages the training data and computational ability of scattered mobile devices to learn prediction models, while multi-task learning infers personalized but related models among devices. Some recent work has integrated federated and multi-task learning, but such approaches may be impractical and inefficient in the online scenario, e.g., when new mobile devices keep joining the mobile computing system. To address this challenge, we propose OFMTL, an online federated multi-task learning algorithm, which learns the model parameters for a newly joined device without revisiting the data of existing devices. The model parameters are derived by effectively combining the information inferred from the new device's local data with the information borrowed from existing models. Through extensive experiments on three real datasets, we show that the proposed OFMTL framework achieves accuracy comparable to existing algorithms with much lower computation, transmission, and storage costs.
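The abstract describes learning a new device's model from its local data plus knowledge borrowed from existing models, without touching the existing devices' raw data. The sketch below is only an illustration of that general idea, not the OFMTL algorithm from the paper: the function name, the linear model, the squared loss, and the regularization toward the average of existing models are all assumptions made for the example.

```python
import numpy as np

def init_new_device_model(existing_models, X_local, y_local,
                          lam=0.1, lr=0.01, epochs=100):
    """Hypothetical sketch: fit a linear model on the new device's local data
    while pulling the solution toward the average of the existing devices'
    models, so knowledge is transferred without revisiting their data."""
    w_ref = np.mean(existing_models, axis=0)  # information borrowed from existing models
    w = w_ref.copy()                          # warm start from the shared reference
    n = len(y_local)
    for _ in range(epochs):
        # gradient of 0.5*||X w - y||^2 / n + 0.5*lam*||w - w_ref||^2
        grad = X_local.T @ (X_local @ w - y_local) / n + lam * (w - w_ref)
        w -= lr * grad
    return w

# Toy usage: three existing device models and a small local dataset on the new device.
rng = np.random.default_rng(0)
existing_models = np.stack([rng.normal(size=5) for _ in range(3)])
X_local = rng.normal(size=(20, 5))
y_local = X_local @ existing_models[0] + 0.1 * rng.normal(size=20)
w_new = init_new_device_model(existing_models, X_local, y_local)
```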
Year
2019
DOI
10.1109/BigData47090.2019.9006060
Venue
2019 IEEE International Conference on Big Data (Big Data)
Keywords
Federated learning, multi-task relationship learning, online learning
Field
Training set, Mobile computing, Federated learning, Multi-task learning, Computer science, Distributed learning, Mobile device, Artificial intelligence, Predictive modelling, Machine learning, Computation
DocType
Conference
ISSN
2639-1589
ISBN
978-1-7281-0859-9
Citations
1
PageRank
0.35
References
0
Authors
4
Name, Order, Citations, PageRank
Fenglong Ma, 1, 374, 33.08
Fenglong Ma, 2, 374, 33.08
Wenjun Jiang, 3, 356, 24.25
Jing Gao, 4, 2723, 131.05