| Abstract |
| --- |
| This article presents a procedure for improving generalization in classification trees. The procedure adjusts the nodes of a tree so as to move the borders between regions of the problem space away from the neighboring training samples. The objective is to place the borders in a better position inside the problem space and to detect some noisy training samples. As a result, performance on the training samples worsens (noisy training samples are no longer classified correctly) while performance on validation samples improves. These effects are confirmed experimentally on several examples. |
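The core idea of the abstract — shifting a decision border away from its nearest training samples on both sides — can be sketched for a single linear (perceptron-style) node. The function below is an illustrative assumption, not the paper's actual algorithm: it recenters the bias of a hyperplane `w·x + b = 0` so the border lies midway between the closest correctly-separated sample on each side.

```python
import numpy as np

def center_boundary(w, b, X, y):
    """Shift the bias of a linear node w.x + b = 0 so the border lies
    midway between the closest training sample on each side.

    Illustrative sketch only; assumes the node already separates the
    two classes (y in {0, 1}) at this point of the tree.
    """
    norm = np.linalg.norm(w)
    # Signed distance of every sample to the current border.
    d = (X @ w + b) / norm
    m_pos = d[y == 1].min()  # closest sample on the positive side
    m_neg = d[y == 0].max()  # closest sample on the negative side
    # Shifting b by -s*||w|| subtracts s from every signed distance;
    # choose s so the two closest distances become symmetric.
    shift = (m_pos + m_neg) / 2.0
    return b - shift * norm

# Usage: a 1-D node at x = 0 with classes at {-2, -1} and {3, 5}
# is recentered to x = 1, midway between the closest samples -1 and 3.
w = np.array([1.0])
X = np.array([[-2.0], [-1.0], [3.0], [5.0]])
y = np.array([0, 0, 1, 1])
new_b = center_boundary(w, 0.0, X, y)  # border moves to x = 1 (b = -1)
```

Under this sketch, borders that hug a single sample are the ones moved furthest, which matches the abstract's claim that margins are widened at the cost of training-set accuracy.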
| Year | DOI | Venue |
| --- | --- | --- |
| 2002 | 10.1016/S0925-2312(01)00670-1 | Neurocomputing |

| Keywords | Field | DocType |
| --- | --- | --- |
| Classification tree, Generalization, Perceptron, Neural Tree | Pattern recognition, Artificial intelligence, Perceptron, Decision tree learning, Problem space, Mathematics, Machine learning | Journal |

| Volume | Issue | ISSN |
| --- | --- | --- |
| 48 | 1 | 0925-2312 |

| Citations | PageRank | References |
| --- | --- | --- |
| 2 | 0.37 | 5 |
| Authors |
| --- |
| 3 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| C. J. Mantas | 1 | 2 | 0.37 |
| J. M. Mantas Ruiz | 2 | 2 | 0.37 |
| Fernando Rojas Ruiz | 3 | 34 | 7.37 |