Title
Ensemble of fast learning stochastic gradient boosting
Abstract
Boosting is one of the most popular and powerful learning algorithms. However, due to the sequential nature of its model fitting, the computational time of boosting algorithms can be prohibitive for big data analysis. In this paper, we propose a parallel framework for boosting, called Ensemble of Fast Learning Stochastic Gradient Boosting (EFLSGB). The proposed EFLSGB is well suited for parallel execution and can therefore substantially reduce the computational time. Analysis of simulated and real datasets demonstrates that EFLSGB achieves highly competitive prediction accuracy in comparison with gradient tree boosting.
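As a rough illustration of the idea described in the abstract, the Python sketch below fits several fast-learning stochastic gradient boosting models in parallel and averages their predictions. It is a minimal sketch, not the authors' implementation: the member count, learning rate, subsampling fraction, and tree depth are illustrative assumptions, not settings from the paper.

```python
# Hypothetical sketch of the EFLSGB idea: fit many "fast learning"
# stochastic gradient boosting members in parallel, then ensemble them.
import numpy as np
from joblib import Parallel, delayed
from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor

def fit_member(X, y, seed):
    # subsample < 1 makes this *stochastic* gradient boosting;
    # a large learning_rate with few trees makes each member fast to fit.
    # All hyperparameter values here are assumptions for illustration.
    model = GradientBoostingRegressor(
        n_estimators=50, learning_rate=0.5, subsample=0.5,
        max_depth=3, random_state=seed)
    return model.fit(X, y)

X, y = make_friedman1(n_samples=2000, random_state=0)

# Members are independent, so they can be trained in parallel.
members = Parallel(n_jobs=-1)(
    delayed(fit_member)(X, y, seed) for seed in range(20))

# Ensemble prediction: average over the independently fitted members,
# trading the variance of each fast learner against the ensemble's bias.
y_hat = np.mean([m.predict(X) for m in members], axis=0)
```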
Year
2022
DOI
10.1080/03610918.2019.1645170
Venue
COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION
Keywords
Bias-variance tradeoff, Boosting, Ensemble method, Parallel computing, Tree
DocType
Journal
Volume
51
Issue
1
ISSN
0361-0918
Citations
0
PageRank
0.34
References
0
Authors
3
Name          Order  Citations  PageRank
Bin Li        1      226        9.07
Qingzhao Yu   2      12         1.88
Peng Lu       3      126        17.62