Title: A Secure Federated Transfer Learning Framework
Abstract: Machine learning relies on the availability of vast amounts of data for training. In reality, however, data are mostly scattered across different organizations and cannot be easily integrated due to many legal and practical constraints. To address this important challenge in the field of machine learning, we introduce a new technique and framework, known as federated transfer learning (FTL), to improve statistical modeling under a data federation. FTL allows knowledge to be shared without compromising user privacy and enables complementary knowledge to be transferred across domains in a data federation, thereby enabling a target-domain party to build flexible and effective models by leveraging rich labels from a source domain. This framework requires minimal modifications to the existing model structure and provides the same level of accuracy as non-privacy-preserving transfer learning. It is flexible and can be effectively adapted to various secure multiparty machine learning tasks.
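The keywords below list secret sharing among the framework's cryptographic building blocks. As a purely illustrative aside (not the paper's actual protocol), the following minimal sketch shows the core idea of additive secret sharing: each party's private value is split into random shares, and the parties can compute a sum share-wise without any single share revealing the underlying value. All names here are hypothetical.

```python
import random

PRIME = 2**61 - 1  # field modulus for share arithmetic (toy choice)

def share(value, n=2):
    """Split an integer into n additive shares modulo PRIME."""
    # Toy randomness; a real MPC deployment would use a CSPRNG.
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares modulo PRIME."""
    return sum(shares) % PRIME

# Two parties hold private values 42 and 100; each distributes shares.
a_shares = share(42)
b_shares = share(100)

# Addition is done share-wise, so no party ever sees the other's input.
sum_shares = [(x + y) % PRIME for x, y in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 142
```

Each individual share is uniformly random, so it leaks nothing about the secret; only the full set of shares reconstructs it. The paper's framework combines this style of secret sharing with homomorphic encryption, but the concrete protocol differs from this sketch.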
Year: 2020
DOI: 10.1109/MIS.2020.2988525
Venue: IEEE Intelligent Systems
Keywords: Federated Learning, Transfer Learning, Multi-party Computation, Secret Sharing, Homomorphic Encryption
DocType: Journal
Volume: 35
Issue: 4
ISSN: 1541-1672
Citations: 12
PageRank: 0.66
References: 0
Authors: 5
Name           | Order | Citations | PageRank
Liu Yang       | 1     | 3142      | 0.37
Yan Kang       | 2     | 12        | 0.66
Chaoping Xing  | 3     | 9161      | 10.47
Tianjian Chen  | 4     | 22        | 1.19
Qiang Yang     | 5     | 170398    | 75.69