Abstract |
---|
Most methods that generate decision trees for a specific problem use the data instances themselves in the tree-generation process. This article proposes a method called RBDT-1 (rule-based decision tree) for learning a decision tree from a set of decision rules that cover the data instances, rather than from the data instances themselves. The goal is to create, on demand, a short and accurate decision tree from a stable or dynamically changing set of rules. The rules can be generated by an expert, induced from examples of decision instances by a rule-learning program such as an AQ-type rule induction program, or extracted from a tree generated by another method such as ID3 or C4.5. In terms of tree complexity (the number of nodes and leaves in the decision tree), RBDT-1 compares favorably with AQDT-1 and AQDT-2, two methods that create decision trees from rules. RBDT-1 also compares favorably with ID3 and is as effective as C4.5, both of which are well-known methods that generate decision trees from data examples. Experiments show that the classification accuracies of the decision trees produced by all the compared methods are indistinguishable. |
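The abstract contrasts rule-based tree construction with data-driven methods such as ID3 and C4.5, which pick splitting attributes from the examples themselves. As background only, here is a minimal sketch of ID3's information-gain criterion on a toy dataset; this is not the paper's RBDT-1 algorithm, and the data and function names are illustrative assumptions.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction from splitting the examples on one attribute."""
    n = len(labels)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(subset) / n * entropy(subset)
                    for subset in by_value.values())
    return entropy(labels) - remainder

# Toy data (illustrative): each row is (outlook, windy); label is play yes/no.
rows = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes")]
labels = ["yes", "yes", "no", "no"]

# Attribute 0 (outlook) separates the classes perfectly, attribute 1 not at all.
print(information_gain(rows, labels, 0))  # 1.0
print(information_gain(rows, labels, 1))  # 0.0
```

ID3 greedily picks the attribute with the highest gain at each node; RBDT-1, by contrast, selects attributes from the rule set rather than from the examples.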
Year | DOI | Venue |
---|---|---|
2016 | 10.1111/coin.12049 | COMPUTATIONAL INTELLIGENCE |
Keywords | Field | DocType
---|---|---
attribute selection criteria, decision rules, data-based decision tree, rule-based decision tree, tree complexity | Decision tree, Data mining, Grafting (decision trees), Computer science, Artificial intelligence, ID3 algorithm, Alternating decision tree, Decision stump, Decision rule, Pattern recognition, Decision tree learning, Machine learning, Incremental decision tree | Journal
Volume | Issue | ISSN
---|---|---
32 | 2 | 0824-7935
Citations | PageRank | References
---|---|---
0 | 0.34 | 7
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---
Amany Abdelhalim | 1 | 8 | 2.62 |
Issa Traore | 2 | 306 | 32.31 |
Youssef Nakkabi | 3 | 14 | 1.82 |