Abstract |
---|
When the problem of learning from data is solved through a regression tree estimator, the quality of the available observations is an important issue, since it directly influences the accuracy of the resulting model. It becomes particularly relevant when there is freedom to sample the input space arbitrarily to build the tree model or, alternatively, when we need to select a subsample to train the tree estimator on a computationally feasible input set, or to evaluate the goodness of the estimation on a test set. Here the accuracy of estimation based on regression trees is analyzed from the point of view of geometric properties of the available input data. In particular, the concept of F-discrepancy, a quantity that measures how well a set of points represents the distribution underlying the input generation process, is applied to derive conditions for convergence to the optimal piecewise-constant estimator for the unknown function we want to learn. The analysis has a constructive nature, making it possible to select in practice good input sets for the problem at hand, as shown in a simulation example involving a real data set. |
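The F-discrepancy invoked in the abstract measures the sup-norm distance between the target distribution function F and the empirical distribution function of a point set. The sketch below illustrates the one-dimensional case (where it coincides with the Kolmogorov-Smirnov statistic); the function name and the example point sets are illustrative, not taken from the paper.

```python
import numpy as np

def f_discrepancy_1d(points, cdf):
    """One-dimensional F-discrepancy: the supremum over x of
    |F(x) - (1/n) * #{x_i <= x}|, i.e. the sup-norm distance between
    the target CDF and the empirical CDF of the points."""
    x = np.sort(np.asarray(points, dtype=float))
    n = x.size
    F = cdf(x)
    # The supremum is attained at a sample point, approached from
    # above (d_plus) or from below (d_minus).
    d_plus = np.max(np.arange(1, n + 1) / n - F)
    d_minus = np.max(F - np.arange(0, n) / n)
    return max(d_plus, d_minus)

# Comparison under a uniform target F: a centered lattice attains the
# minimal discrepancy 1/(2n), while i.i.d. random points do worse.
rng = np.random.default_rng(0)
uniform_cdf = lambda t: np.clip(t, 0.0, 1.0)
grid = (np.arange(64) + 0.5) / 64     # centered lattice in [0, 1]
rand = rng.random(64)                 # i.i.d. uniform sample
print(f_discrepancy_1d(grid, uniform_cdf))   # 1/(2*64) = 0.0078125
print(f_discrepancy_1d(rand, uniform_cdf))
```

The comparison mirrors the paper's constructive point: deterministic low-discrepancy designs can represent the input distribution more faithfully than random sampling of the same size.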
Year | DOI | Venue |
---|---|---|
2014 | 10.1109/IJCNN.2014.6889665 | IJCNN |
Keywords | DocType | ISSN
---|---|---|
trees (mathematics),input generation process,learning (artificial intelligence),regression analysis,F-discrepancy concept,regression tree learning,sampling methods,optimal piecewise-constant estimator,regression tree estimator | Conference | 2161-4393
ISBN | Citations | PageRank
---|---|---|
978-1-4799-6627-1 | 2 | 0.37
References | Authors
---|---|
3 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Cristiano Cervellera | 1 | 226 | 23.63 |
Mauro Gaggero | 2 | 130 | 17.60 |
Danilo Macciò | 3 | 64 | 10.95 |