Title
ConfDTree: A Statistical Method for Improving Decision Trees.
Abstract
Decision trees have three main disadvantages: reduced performance when the training set is small; rigid decision criteria; and the fact that a single "uncharacteristic" attribute might "derail" the classification process. In this paper we present ConfDTree (Confidence-Based Decision Tree), a post-processing method that enables decision trees to better classify outlier instances. This method, which can be applied to any decision tree algorithm, uses easy-to-implement statistical methods (confidence intervals and two-proportion tests) in order to identify hard-to-classify instances and to propose alternative routes. The experimental study indicates that the proposed post-processing method consistently and significantly improves the predictive performance of decision trees, particularly for small, imbalanced, or multi-class datasets, for which an average improvement of 5%-9% in AUC is reported.
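The two statistical tools the abstract names (a confidence interval on a node's class proportion and a two-proportion z-test between candidate branches) are standard and can be sketched briefly. This is a minimal illustration of those statistics only, not the paper's ConfDTree implementation; the function names and the normal-approximation interval are assumptions for the sketch.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation confidence interval (95% by default) for the
    class proportion observed at a decision-tree node.
    Illustrative helper; not the paper's exact formulation."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - margin), min(1.0, p + margin)

def two_proportion_z(s1, n1, s2, n2):
    """Two-proportion z-test statistic comparing the class proportions
    of two candidate branches (pooled standard error)."""
    p1, p2 = s1 / n1, s2 / n2
    pooled = (s1 + s2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se
```

For example, a node with 50 positives out of 100 instances yields a 95% interval of roughly (0.40, 0.60); two branches with identical proportions give a z statistic of 0, i.e., no evidence for preferring an alternative route.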
Year
2014
DOI
10.1007/s11390-014-1438-5
Venue
J. Comput. Sci. Technol.
Keywords
decision tree, confidence interval, imbalanced dataset
Field
Data mining, Decision tree, Multiple-criteria decision analysis, Computer science, Outlier, Artificial intelligence, Confidence interval, ID3 algorithm, Alternating decision tree, Machine learning, Decision tree learning, Incremental decision tree
DocType
Journal
Volume
29
Issue
3
ISSN
1860-4749
Citations
0
PageRank
0.34
References
27
Authors
4
Name          Order  Citations  PageRank
Gilad Katz    1      106        15.61
Asaf Shabtai  2      1176       100.03
Lior Rokach   3      2127       142.59
Nir Ofek      4      80         7.69