Title
Creating Ensembles of Classifiers
Abstract
Ensembles of classifiers offer promise in increasing overall classification accuracy. The availability of extremely large datasets has opened avenues for applying distributed and/or parallel learning to efficiently build models from them. In this paper, distributed learning is done by training classifiers on disjoint subsets of the data. We examine a random partitioning method for creating disjoint subsets and propose a more intelligent partitioning method based on clustering. We observed that the intelligent partitioning method generally performs better than random partitioning on our datasets. With both methods, a significant gain in accuracy can be obtained by applying bagging to each disjoint subset, creating multiple diverse classifiers. The significance of our finding is that, even for small or moderately sized datasets, a partitioning strategy combined with bagging can yield better performance than applying a single learner to the entire dataset.
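As a rough illustration of the approach described in the abstract (not the authors' implementation), the sketch below partitions the training data into disjoint subsets, either randomly or via k-means clustering as a stand-in for the "intelligent" partitioning, applies bagging of decision trees to each subset, and combines the per-subset classifiers by majority vote. It assumes scikit-learn and NumPy; the dataset, partition count, and bag size are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.cluster import KMeans


def random_partitions(X, y, n_parts, rng):
    """Split the training data into n_parts random disjoint subsets."""
    idx = rng.permutation(len(X))
    return [(X[p], y[p]) for p in np.array_split(idx, n_parts)]


def clustered_partitions(X, y, n_parts, rng):
    """Group similar examples with k-means and use each cluster as one
    disjoint training subset (a stand-in for the clustering-based split)."""
    labels = KMeans(n_clusters=n_parts, n_init=10,
                    random_state=int(rng.integers(1 << 31))).fit_predict(X)
    return [(X[labels == k], y[labels == k]) for k in range(n_parts)]


def train_ensemble(partitions, n_bags=10, seed=0):
    """Apply bagging of decision trees to each disjoint subset."""
    return [BaggingClassifier(DecisionTreeClassifier(),
                              n_estimators=n_bags,
                              random_state=seed).fit(Xp, yp)
            for Xp, yp in partitions]


def predict_vote(models, X):
    """Combine the per-subset classifiers by unweighted majority vote."""
    votes = np.stack([m.predict(X) for m in models])
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)


if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    rng = np.random.default_rng(0)
    for name, splitter in [("random", random_partitions),
                           ("clustered", clustered_partitions)]:
        models = train_ensemble(splitter(Xtr, ytr, n_parts=4, rng=rng))
        acc = (predict_vote(models, Xte) == yte).mean()
        print(f"{name} partitioning + bagging: accuracy = {acc:.3f}")
```

The accuracy of this ensemble can then be compared against a single learner trained on the full dataset, which is the comparison the paper's finding is about.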
Year
2001
DOI
10.1109/ICDM.2001.989568
Venue
San Jose, CA
Keywords
tires, machine learning, computer science, clustering algorithms, application software, learning artificial intelligence, disjoint subsets, distributed computing, clustering, data mining, decision trees, bagging
Field
Data mining, Decision tree, Parallel learning, Disjoint sets, Ensembles of classifiers, Computer science, Distributed learning, Classification tree analysis, Artificial intelligence, Cluster analysis, Application software, Machine learning
DocType
Conference
ISBN
0-7695-1119-8
Citations
23
PageRank
1.75
References
4
Authors
3
Name              Order  Citations  PageRank
Nitesh Chawla     1      7257       345.79
Steven Eschrich   2      89         10.81
Lawrence O. Hall  3      5543       335.87