Title
Entropy evaluation based on confidence intervals of frequency estimates: Application to the learning of decision trees
Abstract
Entropy gain is widely used for learning decision trees. However, as we go deeper down the tree, the examples become rarer and the faithfulness of the entropy estimate decreases. Thus, misleading split choices and over-fitting may occur, and the tree has to be adjusted by an early-stopping criterion or post-pruning algorithms. However, these methods still depend on the choices made previously, which may be unsatisfactory. We propose a new cumulative entropy function based on confidence intervals on frequency estimates that jointly considers the entropy of the probability distribution and the uncertainty around the estimation of its parameters. This function takes advantage of the ability of a possibility distribution to upper-bound a family of probabilities estimated from a limited set of examples, and of the link between the possibilistic specificity order and entropy. The proposed measure has several advantages over the classical one. It makes statistically significant split choices and provides a relevant stopping criterion that allows the learning of trees whose size is well-suited to the available data. On top of that, it also provides a reasonable estimator of the performance of a decision tree. Finally, we show that it can be used for designing a simple and efficient online learning algorithm.
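The abstract only sketches the idea, so the snippet below is a minimal illustrative sketch, not the authors' actual possibilistic cumulative entropy: it shows how confidence intervals on class frequencies can make an entropy estimate cautious when few examples reach a node. The choice of the Wilson score interval, the normalisation of the interval upper bounds, and the helper names `wilson_interval` and `cautious_entropy` are assumptions of this sketch and are not taken from the paper.

```python
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion k/n
    (an assumed choice of interval, not the paper's)."""
    if n == 0:
        return 0.0, 1.0
    p = k / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return max(0.0, centre - half), min(1.0, centre + half)

def entropy(probs):
    """Shannon entropy (base 2) of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def cautious_entropy(counts, z=1.96):
    """Entropy of a pessimistic distribution built by normalising the upper
    bounds of the per-class confidence intervals.  With few examples the
    intervals are wide, every upper bound is close to 1, the normalised
    distribution is close to uniform and the entropy stays high; with many
    examples it converges towards the classical entropy of the frequencies."""
    n = sum(counts)
    uppers = [wilson_interval(k, n, z)[1] for k in counts]
    total = sum(uppers)
    return entropy([u / total for u in uppers])

if __name__ == "__main__":
    # Same 80/20 class split, observed on 10 vs. 1000 examples.
    print(entropy([0.8, 0.2]))           # ~0.72, regardless of sample size
    print(cautious_entropy([8, 2]))      # ~0.93: too few examples to trust the split
    print(cautious_entropy([800, 200]))  # ~0.75: close to the classical value
```

In a tree learner, such a sample-size-aware measure penalises splits whose apparent entropy gain rests on a handful of examples, which is the behaviour the abstract attributes to the proposed measure and the basis of its stopping criterion.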
Year
2015
Venue
International Conference on Machine Learning
Field
Decision tree, Maximum entropy spectral estimation, Computer science, Upper and lower bounds, Binary entropy function, Probability distribution, Artificial intelligence, Principle of maximum entropy, Confidence interval, Statistics, Machine learning, Estimator
DocType
Conference
Citations
4
PageRank
0.45
References
14
Authors
2
Name | Order | Citations | PageRank
Mathieu Serrurier | 1 | 267 | 26.94
Henri Prade | 2 | 10549 | 1445.02