Abstract |
---|
In classification problems with ordinal monotonicity constraints, the class variable should increase in accordance with a subset of the explanatory variables. Models generated by standard classifiers are not guaranteed to satisfy these monotonicity constraints, so specialized algorithms have been designed for such problems. In the particular case of decision trees, the growing and pruning mechanisms have been modified to produce monotonic trees. Recently, ensembles have also been adapted to this problem, providing a good trade-off between accuracy and degree of monotonicity. In this paper we study the behaviour of these decision tree mechanisms built on an AdaBoost scheme, and combine them with a simple ensemble pruning method based on the degree of monotonicity. After an exhaustive experimental analysis, we conclude that AdaBoost achieves better predictive performance than standard algorithms while also satisfying the monotonicity restriction. |
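The abstract's "degree of monotonicity" refers to how often an ordinal constraint (larger feature values should not yield a smaller class label) is violated. A minimal sketch of such a pairwise check, assuming the degree is measured as the fraction of strictly comparable example pairs that violate the constraint (a common choice; the paper's exact metric may differ):

```python
from itertools import combinations

def non_monotonicity_index(X, y):
    """Fraction of strictly comparable pairs (x_i <= x_j element-wise,
    x_i != x_j) whose labels violate y_i <= y_j.

    Illustrative helper, not the authors' implementation.
    """
    comparable = violations = 0
    for i, j in combinations(range(len(X)), 2):
        le = all(a <= b for a, b in zip(X[i], X[j]))
        ge = all(a >= b for a, b in zip(X[i], X[j]))
        if le and not ge:          # x_i strictly below x_j
            comparable += 1
            violations += y[i] > y[j]
        elif ge and not le:        # x_j strictly below x_i
            comparable += 1
            violations += y[j] > y[i]
    return violations / comparable if comparable else 0.0

# Example: the pair ((2,2)->2, (3,3)->1) violates monotonicity.
X = [[1, 1], [2, 2], [3, 3]]
y = [0, 2, 1]
print(non_monotonicity_index(X, y))  # one violation out of 3 comparable pairs
```

An ensemble pruning step like the one described could then rank the base trees by this index on a validation set and drop the least monotonic ones.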
Year | DOI | Venue |
---|---|---|
2016 | 10.1007/978-3-319-32034-2_43 | HYBRID ARTIFICIAL INTELLIGENT SYSTEMS |
Keywords | Field | DocType
---|---|---|
Monotonic classification, Decision tree induction, AdaBoost, Ensemble pruning | Monotonic function, Decision tree, Standard algorithms, AdaBoost, Pattern recognition, Computer science, Ordinal number, Artificial intelligence, Class variable, Machine learning, Pruning | Conference

Volume | ISSN | Citations
---|---|---|
9648 | 0302-9743 | 2

PageRank | References | Authors
---|---|---|
0.36 | 13 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Sergio González | 1 | 26 | 2.68 |
Francisco Herrera | 2 | 27391 | 1168.49 |
S. G. Garcia | 3 | 569 | 24.88 |