Title
Building Weighted Classifier Ensembles Through Classifiers Pruning.
Abstract
Many theoretical and experimental studies have shown that ensemble learning is an effective technique for achieving better classification accuracy and stability than individual classifiers. In this paper, we propose a novel two-stage weighted classifier ensemble method based on classifier pruning. In the first stage, we use canonical correlation analysis (CCA) to model the maximum correlation between training data points and base classifiers. Based on these global multi-linear projections, a sparse regression method is proposed to prune the base classifiers, so that each test data point dynamically selects a subset of classifiers to form a unique ensemble; this reduces the effect of noisy input data and incorrect classifiers from a global view. In the second stage, the pruned classifiers are weighted locally by a fusion method that exploits their generalization ability among the nearest neighbors of each test data point. In this way, each test data point builds a unique, locally weighted classifier ensemble. Experimental results on several UCI data sets show that our method outperforms other ensemble methods such as Random Forests, Majority Voting, AdaBoost, and DREP.
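The following is a minimal Python sketch of the two-stage idea described in the abstract, not the authors' implementation: it assumes scikit-learn decision trees as the base pool, uses a Lasso regression in the CCA-projected space as a stand-in for the paper's sparse regression step, and weights the pruned classifiers by their accuracy on the nearest training neighbours of each test point. Helper names such as prune_classifiers and predict_one are illustrative only.

# Sketch of the two-stage weighted ensemble idea (assumptions noted above).
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.datasets import load_iris
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Pool of base classifiers trained on bootstrap samples of the training set.
base = []
for _ in range(15):
    idx = rng.choice(len(X_tr), len(X_tr), replace=True)
    base.append(DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr[idx], y_tr[idx]))

# Per-sample correctness profile of each base classifier (1 = correct, 0 = wrong).
P_tr = np.column_stack([clf.predict(X_tr) == y_tr for clf in base]).astype(float)

# Stage 1 (global): CCA couples the data space with the classifier-profile space;
# a sparse (Lasso) regression in the projected space scores classifiers per point.
cca = CCA(n_components=2).fit(X_tr, P_tr)
Z_tr, _ = cca.transform(X_tr, P_tr)
Z_te = cca.transform(X_te)
lasso = Lasso(alpha=1e-3).fit(Z_tr, P_tr)

def prune_classifiers(z, k_keep=7):
    """Keep the k_keep classifiers with the highest predicted reliability at z."""
    scores = lasso.predict(z.reshape(1, -1)).ravel()
    return np.argsort(scores)[-k_keep:]

# Stage 2 (local): weight each kept classifier by its accuracy on the nearest
# training neighbours of the test point, then take a weighted vote.
nn = NearestNeighbors(n_neighbors=10).fit(X_tr)
classes = np.unique(y_tr)

def predict_one(x, z):
    keep = prune_classifiers(z)
    neigh = nn.kneighbors(x.reshape(1, -1), return_distance=False).ravel()
    weights = P_tr[np.ix_(neigh, keep)].mean(axis=0) + 1e-6  # local accuracy per kept classifier
    votes = np.zeros(len(classes))
    for w, j in zip(weights, keep):
        pred = base[j].predict(x.reshape(1, -1))[0]
        votes[np.searchsorted(classes, pred)] += w
    return classes[votes.argmax()]

preds = np.array([predict_one(x, z) for x, z in zip(X_te, Z_te)])
print("test accuracy:", (preds == y_te).mean())

In this sketch the pruning is global (driven by the CCA projection of the test point) while the weighting is local (driven by its nearest training neighbours), mirroring the global/local split of the two stages; the choice of Lasso, tree depth, pool size, and neighbourhood size are placeholder settings, not values from the paper.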
Year: 2017
Venue: ICIMCS
Field: Data set, AdaBoost, Pattern recognition, Canonical correlation, Computer science, Test data, Artificial intelligence, Majority rule, Random forest, Classifier (linguistics), Ensemble learning
DocType: Conference
Citations: 0
PageRank: 0.34
References: 8
Authors: 4
Name                   Order  Citations  PageRank
Chenwei Cai            1      0          0.34
Dickson Keddy Wornyo   2      2          2.05
Liangjun Wang          3      7          4.21
Xiangjun Shen          4      50         13.58