Title
Empirical study on weighted voting multiple classifiers
Abstract
Combining multiple classifiers is expected to increase classification accuracy, and research on combination strategies for multiple classifiers has become a popular topic. For a crisp classifier, which returns a discrete class label rather than a set of real-valued probabilities over the classes, the most commonly used combination method is majority voting. Both majority and weighted majority voting are classifier-based voting schemes, which assign each base classifier a single, fixed confidence in voting. However, each classifier should have different voting priorities with respect to its learning space, and such differences cannot be reflected by a classifier-based voting strategy. In this paper, we propose two further voting strategies that take these differences into consideration. We apply the AdaBoost algorithm to generate multiple classifiers and vary its voting strategy. The predictive ability of each voting strategy is then tested and compared on eight datasets from the UCI Machine Learning Repository. The experimental results show that one of the proposed voting strategies, the sample-based voting scheme, achieves better classification accuracy.
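For context on the abstract's terminology, below is a minimal sketch of classifier-based weighted majority voting over crisp classifiers, in the style of AdaBoost's final combination. The function name, the NumPy layout, and the example weights are illustrative assumptions, not the paper's code; the proposed sample-based scheme, which the abstract only names, is not reproduced here because the abstract does not specify it.

    import numpy as np

    def weighted_majority_vote(predictions, alphas, classes):
        # Classifier-based weighted majority voting (illustrative sketch).
        # predictions: (T, N) array of crisp class labels, one row per base classifier.
        # alphas:      (T,) per-classifier voting weights; under AdaBoost,
        #              alpha_t = 0.5 * ln((1 - err_t) / err_t) for weighted error err_t.
        # classes:     sequence of possible class labels.
        predictions = np.asarray(predictions)
        alphas = np.asarray(alphas, dtype=float)
        scores = np.zeros((len(classes), predictions.shape[1]))
        for k, c in enumerate(classes):
            # Each classifier casts its full, fixed weight for its predicted label;
            # a sample-based scheme would instead vary this weight per sample.
            scores[k] = alphas @ (predictions == c)
        return np.asarray(classes)[scores.argmax(axis=0)]

    # Example: three classifiers, four samples, binary labels {0, 1}.
    preds = [[0, 1, 1, 0],
             [1, 0, 1, 1],
             [1, 0, 1, 0]]
    print(weighted_majority_vote(preds, alphas=[0.9, 0.4, 0.3], classes=[0, 1]))
    # -> [0 1 1 0]; the weight-0.9 classifier overrides the unweighted
    #    majority on the first two samples.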
Year: 2005
DOI: 10.1007/11551188_36
Venue: ICAPR (1)
Keywords: weighted majority voting, proposed voting strategy, weighted voting, classifier-based voting strategy, classification accuracy, different voting priority, voting strategy, multiple classifier, empirical study, majority voting, sample-based voting scheme, classifier-based voting scheme, machine learning
Field: AdaBoost algorithm, Anti-plurality voting, Voting, Computer science, Weighted voting, Sensor fusion, Artificial intelligence, Classifier (linguistics), Majority rule, Machine learning, Empirical research
DocType: Conference
Volume: 3686
ISSN: 0302-9743
ISBN: 3-540-28757-4
Citations: 4
PageRank: 1.00
References: 13
Authors: 3
Name                Order  Citations  PageRank
Yanmin Sun          1      770        21.67
Mohamed S. Kamel    2      4523       282.55
Andrew K. C. Wong   3      4063       518.39