Title
VF2Boost: Very Fast Vertical Federated Gradient Boosting for Cross-Enterprise Learning
Abstract
With ever-evolving concerns over privacy protection, vertical federated learning (FL), in which participants own non-overlapping features for the same set of instances, has become a hot topic, since it enables multiple enterprises to strengthen their machine learning models collaboratively with privacy guarantees. Nevertheless, to achieve privacy preservation, vertical FL algorithms involve complicated training routines and time-consuming cryptographic operations, leading to slow training. This paper explores the efficiency of the gradient boosting decision tree (GBDT) algorithm under the vertical FL setting. Specifically, we introduce VF²Boost, a novel and efficient vertical federated GBDT system, with targeted solutions for the major bottlenecks. First, to reduce the idle periods caused by frequent mutual waiting during federated training, we propose a concurrent training protocol. Second, to speed up the cryptographic operations, we analyze the characteristics of the algorithm and propose customized operations. Empirical results show that our system is 12.8-18.9 times faster than existing vertical federated implementations and supports much larger datasets.
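The record contains no code, but the mechanism the abstract alludes to can be illustrated. In vertical federated GBDT, the party holding the labels encrypts per-instance gradients with an additively homomorphic scheme (typically Paillier), so a feature-holding party can sum ciphertexts into histogram bins for split finding without ever seeing a plaintext gradient; these ciphertext operations are exactly the cost VF²Boost optimizes. The sketch below is my own minimal toy (not the paper's code): all names (`keygen`, `enc`, `dec`, `SCALE`, `bins`) are assumptions, and the 64-bit demo keys are far too small for real use.

```python
# Toy Paillier cryptosystem + encrypted gradient-histogram aggregation.
# Demo only: real systems use >=2048-bit keys and vetted libraries.
import math
import random

def _rand_prime(bits):
    # Fermat tests with small bases; adequate for a demo, not production.
    while True:
        p = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if all(pow(a, p - 1, p) == 1 for a in (2, 3, 5, 7, 11, 13)):
            return p

def keygen(bits=64):
    p = _rand_prime(bits)
    q = _rand_prime(bits)
    while q == p:
        q = _rand_prime(bits)
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)           # inverse exists w.h.p. for random primes
    return n, (n, lam, mu)         # public key, secret key

def enc(n, m):
    # Paillier with g = n + 1: c = (1 + m*n) * r^n mod n^2
    n2 = n * n
    r = random.randrange(1, n)
    return (1 + (m % n) * n) * pow(r, n, n2) % n2

def add(n, c1, c2):
    # Homomorphic addition: Enc(a) * Enc(b) mod n^2 = Enc(a + b)
    return c1 * c2 % (n * n)

def dec(sk, c):
    n, lam, mu = sk
    m = (pow(c, lam, n * n) - 1) // n * mu % n
    return m if m <= n // 2 else m - n    # decode signed values

# --- one toy vertical FL aggregation round --------------------------------
pk, sk = keygen()
SCALE = 10**6                              # fixed-point encoding of floats
grads = [0.12, -0.40, 0.05, 0.33]          # active party's gradients
cts = [enc(pk, round(g * SCALE)) for g in grads]

# Passive party: bins instances by its own private feature and sums the
# ciphertexts per bin; it never observes a plaintext gradient.
bins = {0: [0, 2], 1: [1, 3]}
agg = {}
for b, idx in bins.items():
    c = cts[idx[0]]
    for i in idx[1:]:
        c = add(pk, c, cts[i])
    agg[b] = c

# Active party decrypts only the per-bin sums needed for split finding.
hist = {b: dec(sk, c) / SCALE for b, c in agg.items()}
print(hist)   # {0: 0.17, 1: -0.07}
```

Because each histogram bin costs one ciphertext multiplication per instance and decryption is expensive, the number and cost of these operations dominate training time, which is why the paper's customized cryptographic operations and concurrent (non-blocking) protocol yield such large speedups.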
Year
2021
DOI
10.1145/3448016.3457241
Venue
International Conference on Management of Data
Keywords
Vertical Federated Learning, Gradient Boosting Decision Tree
DocType
Conference
ISSN
0730-8078
Citations
0
PageRank
0.34
References
0
Authors
7

Name           Order  Citations  PageRank
Fangcheng Fu     1        13        3.93
Yingxia Shao     2       213       24.25
Lele Yu          3        70        6.93
Jiawei Jiang     4        89       14.60
Huanran Xue      5         0        2.03
Yangyu Tao       6         2        2.51
Bin Cui          7      1843      124.59