Abstract |
---|
Shannon entropy, used in standard top-down decision trees, does not guarantee the best generalization. Split criteria based on generalized entropies offer a different compromise between node purity and overall information gain. Modified C4.5 decision trees based on Tsallis and Rényi entropies have been tested on several high-dimensional microarray datasets with interesting results. This approach may be used in any decision tree and information selection algorithm. |
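The abstract contrasts the standard Shannon split criterion with Tsallis and Rényi generalizations. A minimal sketch of the three node-impurity measures is given below; it is an illustration of the standard formulas, not the authors' C4.5 implementation, and the class distribution `p` and the parameters `q = 2` and `alpha = 2` are arbitrary example values.

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log2 p_i  (bits); terms with p_i = 0 contribute 0.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def tsallis_entropy(p, q):
    # S_q(p) = (1 - sum_i p_i^q) / (q - 1); recovers Shannon (in nats) as q -> 1.
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

def renyi_entropy(p, alpha):
    # H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha); recovers Shannon as alpha -> 1.
    if alpha == 1:
        return shannon_entropy(p)
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

# Example class distribution at a candidate tree node (3 classes).
p = [0.5, 0.3, 0.2]
print(shannon_entropy(p))      # ≈ 1.485 bits
print(tsallis_entropy(p, 2))   # (1 - 0.38) / 1 = 0.62
print(renyi_entropy(p, 2))     # -log2(0.38) ≈ 1.396
```

In a modified C4.5, one of these measures would replace Shannon entropy inside the information-gain computation at each split; varying `q` or `alpha` shifts how strongly the criterion rewards near-pure nodes.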
Year | DOI | Venue |
---|---|---|
2008 | 10.1007/978-3-540-69731-2_62 | ICAISC |
Keywords | Field | DocType
---|---|---|
renyi entropy,generalized entropy,entropy,shannon entropy,information selection,information theory,modified c4.5,decision trees,standard top-down decision tree,information selection algorithm,overall information gain,decision rules,different compromise,decision tree,best generalization,decision rule,information gain,tsallis entropy,top down | Entropy power inequality,Pattern recognition,Rényi entropy,Information diagram,Tsallis entropy,Shannon's source coding theorem,Joint entropy,Artificial intelligence,Min entropy,Entropy (information theory),Machine learning,Mathematics | Conference
Volume | ISSN | Citations
---|---|---|
5097 | 0302-9743 | 16
PageRank | References | Authors
---|---|---|
1.07 | 7 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Tomasz Maszczyk | 1 | 42 | 5.29
Włodzisław Duch | 2 | 291 | 28.95 |