Title |
---|
Learning Optimization For Decision Tree Classification Of Non-Categorical Data With Information Gain Impurity Criterion |
Abstract |
---|
We consider the problem of constructing decision trees when the data are non-categorical and inherently high-dimensional. Conventional tree-growing algorithms that either rely on univariate splits or employ direct search methods to determine multivariate splitting conditions are computationally prohibitive. On the other hand, applying standard optimization methods to find locally optimal splitting conditions is hindered by the abundance of local minima and the discontinuities of classical goodness functions such as information gain or Gini impurity. To overcome this limitation, a method is proposed for generating smoothed replacements for split impurity measures. This makes a vast range of efficient optimization techniques applicable to finding locally optimal splits and, at the same time, decreases the number of local minima. The approach is illustrated with examples. |
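The abstract's core idea is to replace the discontinuous impurity criterion with a smooth surrogate so that gradient-based optimizers can search for multivariate splits. A minimal sketch of that idea is below, using a sigmoid relaxation of the hard split indicator; the sigmoid form, the steepness parameter `beta`, and the function names are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def entropy(p):
    # Shannon entropy (bits) of a class-probability vector; 0*log(0) := 0.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def soft_information_gain(X, y, w, b, beta=5.0):
    """Smoothed information gain for the oblique split w.x + b > 0.

    The hard indicator 1[w.x + b > 0] is replaced by a sigmoid with
    steepness beta (an assumed smoothing choice), which makes the
    criterion differentiable in (w, b) and usable with standard
    gradient-based optimizers.
    """
    # Soft membership weight of each sample in the "right" branch.
    s = 1.0 / (1.0 + np.exp(-beta * (X @ w + b)))
    classes = np.unique(y)
    n = len(y)
    # Parent class distribution.
    p_parent = np.array([(y == c).mean() for c in classes])
    # Soft class distributions and soft sizes of the two branches.
    n_right, n_left = s.sum(), (1.0 - s).sum()
    p_right = np.array([s[y == c].sum() for c in classes]) / n_right
    p_left = np.array([(1.0 - s)[y == c].sum() for c in classes]) / n_left
    # Smoothed gain: parent entropy minus weighted child entropies.
    return (entropy(p_parent)
            - (n_right / n) * entropy(p_right)
            - (n_left / n) * entropy(p_left))
```

As `beta` grows, the surrogate approaches the usual (discontinuous) information gain of the hard split; for a perfectly separating hyperplane and a steep sigmoid, the smoothed gain approaches the parent entropy, while an uninformative split yields a gain near zero.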
Year | Venue | DocType |
---|---|---|
2014 | PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | Conference |

ISSN | Citations | PageRank |
---|---|---|
2161-4393 | 0 | 0.34 |

References | Authors |
---|---|
0 | 6 |
Name | Order | Citations | PageRank |
---|---|---|---|
K. I. Sofeikov | 1 | 7 | 0.83 |
Ivan Tyukin | 2 | 71 | 9.53 |
Alexander N Gorban | 3 | 90 | 16.13 |
Eugenij Moiseevich Mirkes | 4 | 11 | 3.08 |
Danil V. Prokhorov | 5 | 374 | 37.68 |
I. V. Romanenko | 6 | 0 | 0.34 |