Title
Boosting Lite: Handling Larger Datasets and Slower Base Classifiers
Abstract
In this paper, we examine ensemble algorithms (Boosting Lite and ivoting) that provide accuracy approximating that of a single classifier but require significantly fewer training examples per classifier. Such algorithms allow ensemble methods to operate on very large data sets or to use very slow learning algorithms. Boosting Lite is compared with ivoting, standard boosting, and building a single classifier. Comparisons are done on 11 data sets to which other approaches have been applied. We find that ensembles of support vector machines can attain higher accuracy with less data than ensembles of decision trees. We find that ivoting may result in higher-accuracy ensembles on some data sets; however, Boosting Lite is generally able to indicate when boosting will increase overall accuracy.
Year
2007
DOI
10.1007/978-3-540-72523-7_17
Venue
MCS
Keywords
slower base classifier, boosting lite, ensemble method, higher accuracy ensemble, overall accuracy, single classifier, large data set, larger datasets, decision tree, higher accuracy, ensemble algorithm, boosting, decision trees, support vector machines, support vector machine
Field
Decision tree, Pattern recognition, Computer science, Support vector machine, Boosting (machine learning), Artificial intelligence, LPBoost, Margin classifier, Ensemble learning, BrownBoost, Machine learning, Gradient boosting
DocType
Conference
Volume
4472
ISSN
0302-9743
Citations
2
PageRank
0.36
References
8
Authors
4
Name                  Order  Citations  PageRank
Lawrence O. Hall      1      5543       335.87
Robert E. Banfield    2      358        17.16
Kevin W. Bowyer       3      11121      734.33
W. Philip Kegelmeyer  4      3498       146.54