Abstract |
---|
COMET is a single-pass MapReduce algorithm for learning on large-scale data. It builds multiple random forest ensembles on distributed blocks of data and merges them into a mega-ensemble. This approach is appropriate when learning from massive-scale data that is too large to fit on a single machine. To get the best accuracy, IVoting should be used instead of bagging to generate the training subset for each decision tree in the random forest. Experiments with two large datasets (5GB and 50GB compressed) show that COMET compares favorably (in both accuracy and training time) to learning on a subsample of the data with a serial algorithm. Finally, we propose a new Gaussian approach for lazy ensemble evaluation, which dynamically decides how many ensemble members to evaluate per data point; this can reduce evaluation cost by 100X or more. |
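The lazy ensemble evaluation described in the abstract can be illustrated with a minimal sketch: evaluate ensemble members one at a time and stop once a Gaussian (normal) confidence interval on the running vote fraction excludes the 0.5 decision boundary. This is an illustrative simplification of the idea, not the authors' exact method; the function names, the z-score of 3.0, and the minimum of 10 evaluations are assumptions for the example.

```python
import math

def lazy_ensemble_vote(members, x, z=3.0, min_evals=10):
    """Binary-classification ensemble vote with early stopping.

    `members` is a list of callables returning 0 or 1 for input `x`.
    After each evaluation, a normal approximation to the vote fraction
    decides whether the majority is already statistically settled.
    Returns (predicted_class, number_of_members_evaluated).
    """
    votes = 0
    n = 0
    for member in members:
        n += 1
        votes += member(x)
        if n >= min_evals:
            p = votes / n  # running fraction voting for class 1
            # Gaussian confidence half-width for the vote fraction
            half_width = z * math.sqrt(p * (1 - p) / n)
            # Stop if the interval excludes the 0.5 decision boundary
            if p - half_width > 0.5 or p + half_width < 0.5:
                break
    return (1 if votes / n > 0.5 else 0), n
```

For a mega-ensemble whose members agree strongly on a given point, the loop terminates after a handful of evaluations instead of touching all members, which is the source of the large evaluation-cost savings the abstract claims.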
Year | DOI | Venue |
---|---|---|
2011 | 10.1109/ICDM.2011.39 | International Conference on Data Mining |
Keywords | DocType | Volume |
---|---|---|
evaluation cost, massive data, new Gaussian approach, massive-scale data, data point, large datasets, large ensembles, best accuracy, multiple random forest ensembles, large-scale data, ensemble member, lazy ensemble evaluation, distributed processing, random forest, Gaussian processes, learning (artificial intelligence), data handling, decision trees, cluster computing, decision tree | Conference | abs/1103.2068 |
ISSN | Citations | PageRank |
---|---|---|
ICDM 2011: Proceedings of the 2011 IEEE International Conference on Data Mining, pp. 41-50, 2011 | 12 | 0.66 |
References | Authors |
---|---|
22 | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Justin Basilico | 1 | 179 | 14.28 |
M. Arthur Munson | 2 | 28 | 1.81 |
Tamara G. Kolda | 3 | 5079 | 262.60 |
Kevin R. Dixon | 4 | 65 | 9.43 |
W. Philip Kegelmeyer | 5 | 3498 | 146.54 |