Title
A privacy-preserving and non-interactive federated learning scheme for regression training with gradient descent
Abstract
In recent years, machine learning technologies have been extensively applied in various fields. However, in many applications, massive amounts of data are stored in a distributed manner across multiple data owners. Meanwhile, due to privacy concerns and communication constraints, it is difficult to bridge the data silos among data owners to train a global machine learning model. In this paper, we propose a privacy-preserving and non-interactive federated learning scheme for regression training with gradient descent, named VANE. With VANE, multiple data owners are able to train a global linear, ridge, or logistic regression model with the assistance of the cloud, while their private local training data are well protected. Specifically, we first design a secure data aggregation algorithm, with which local training data from multiple data owners can be aggregated and trained into a global model without disclosing any private information. Meanwhile, benefiting from our data pre-processing method, the whole training process is non-interactive, i.e., there is no interaction between data owners and the cloud. Detailed security analysis shows that VANE can well protect the local training data of data owners. The performance evaluation results demonstrate that the training of our VANE is around 10^3 times faster than that of existing schemes.
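For context, the following is a minimal, illustrative sketch of the centralized baseline that the abstract refers to: regression training (here, ridge regression) with batch gradient descent. It does not implement VANE's secure data aggregation, data pre-processing, or any cryptographic protection, and the function name, hyperparameters, and data are assumptions made purely for illustration.

# Minimal sketch of centralized ridge-regression training with batch
# gradient descent. Illustrative only; this is NOT the VANE scheme and
# includes none of its privacy-preserving machinery.
import numpy as np

def ridge_gradient_descent(X, y, lam=0.1, lr=0.01, epochs=1000):
    """Minimize (1/n)||Xw - y||^2 + lam*||w||^2 by gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        residual = X @ w - y                              # prediction error
        grad = (2.0 / n) * (X.T @ residual) + 2.0 * lam * w
        w -= lr * grad                                    # gradient step
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ true_w + 0.1 * rng.normal(size=200)
    print(ridge_gradient_descent(X, y))                   # approx. true_w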
Year
2021
DOI
10.1016/j.ins.2020.12.007
Venue
Information Sciences
Keywords
Regression training, Privacy-preserving, Secure data aggregation, Gradient descent
DocType
Journal
Volume
552
ISSN
0020-0255
Citations
3
PageRank
0.37
References
0
Authors
5
Name            Order    Citations    PageRank
Fengwei Wang    1        9            3.51
Hui Zhu         2        83           17.00
Rongxing Lu     3        5091         301.87
Yandong Zheng   4        20           6.38
Hui Li          5        40           5.83