Abstract |
---|
In the Semantic Web context, one of the most important issues in the class-membership prediction task, performed through inductive models on ontological knowledge bases, concerns the imbalanced distribution of the training examples, mostly due to the heterogeneous nature and the incompleteness of the knowledge bases. An ensemble learning approach has been proposed to cope with this problem. However, the majority voting procedure exploited for deciding the membership does not explicitly consider the uncertainty and the conflict among the classifiers of an ensemble model. Starting from this observation, we propose to integrate Dempster-Shafer (DS) theory with ensemble learning. Specifically, we propose an algorithm for learning Evidential Terminological Random Forest models, an extension of Terminological Random Forests with DS theory. An empirical evaluation showed that: (i) the resulting models perform better on datasets with many positive and negative examples and behave less conservatively than the voting-based forests; (ii) the new extension decreases the variance of the results. |
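The paper's own learning and combination procedure is defined in the full text; purely as an illustration of the underlying idea, the sketch below shows Dempster's rule of combination in Python, which fuses the belief mass assignments of two classifiers (rather than counting their votes) and explicitly quantifies their conflict. The two-class frame `{'pos', 'neg'}` and the mass values are invented for this example and are not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule.

    Each mass function maps a frozenset (a subset of the frame of
    discernment) to its assigned mass; masses must sum to 1.
    """
    combined = {}
    conflict = 0.0  # total mass assigned to contradictory pairs
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence: combination undefined")
    # Normalize by the non-conflicting mass (1 - K)
    norm = 1.0 - conflict
    return {s: v / norm for s, v in combined.items()}

# Hypothetical outputs of two tree classifiers over the frame {pos, neg};
# mass on the whole frame encodes the classifier's own uncertainty.
m1 = {frozenset({'pos'}): 0.6, frozenset({'pos', 'neg'}): 0.4}
m2 = {frozenset({'pos'}): 0.3, frozenset({'neg'}): 0.5,
      frozenset({'pos', 'neg'}): 0.2}
combined = dempster_combine(m1, m2)
```

Unlike majority voting, the combined mass function keeps a separate quantity for the residual uncertainty (the mass left on the whole frame), which is what allows an evidential forest to behave less conservatively only when the trees actually agree.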
Year | DOI | Venue |
---|---|---|
2015 | 10.1007/978-3-319-18818-8_26 | Extended Semantic Web Conference |
Field | DocType | Volume
---|---|---
Ontology, Data mining, Voting, Ensemble forecasting, Computer science, Semantic Web, Artificial intelligence, Majority rule, Random forest, Ensemble learning, Machine learning, Evidence-based practice | Conference | 9088
ISSN | Citations | PageRank
---|---|---
0302-9743 | 3 | 0.37
References | Authors
---|---
12 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Giuseppe Rizzo | 1 | 349 | 37.75 |
Claudia D'Amato | 2 | 733 | 57.03 |
Nicola Fanizzi | 3 | 1124 | 90.54 |
Floriana Esposito | 4 | 2434 | 277.96 |