Abstract |
---|
We introduce a multiple instance learning algorithm based on randomized decision trees. Our model extends an existing algorithm by Blockeel et al. [2] in several ways: 1) We learn a random forest instead of a single tree. 2) We construct the trees with splits based on non-linear boundaries over multiple features at a time. 3) We learn an optimal way of combining the decisions of multiple trees under the multiple instance constraints (i.e. positive bags contain at least one positive instance, negative bags contain only negative instances). Experiments on typical benchmark data sets show that this model's prediction performance is clearly better than that of earlier tree-based methods, and is comparable to the global state of the art. |
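
To make the multiple instance setting referred to in the abstract concrete, the sketch below shows a naive random-forest MIL baseline: every instance inherits its bag's label for training, and a bag is scored by the maximum positive-class probability over its instances, which encodes the constraint that a positive bag contains at least one positive instance. This is only an illustration of the problem setup under assumed toy data and invented function names; it is not the algorithm proposed in the paper.

```python
# Naive MIL baseline with a random forest (illustrative only, not the paper's method).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def fit_mil_forest(bags, bag_labels, n_estimators=100, random_state=0):
    """bags: list of (n_i, d) instance arrays; bag_labels: array of {0, 1}."""
    X = np.vstack(bags)
    # Naive label propagation: every instance receives its bag's label.
    y = np.concatenate([np.full(len(b), lab) for b, lab in zip(bags, bag_labels)])
    forest = RandomForestClassifier(n_estimators=n_estimators, random_state=random_state)
    forest.fit(X, y)
    return forest

def predict_bags(forest, bags):
    """Bag score = max over its instances' positive-class probabilities (MIL assumption)."""
    scores = np.array([forest.predict_proba(b)[:, 1].max() for b in bags])
    return (scores >= 0.5).astype(int), scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: negative bags are pure noise; each positive bag hides one shifted instance.
    neg = [rng.normal(0.0, 1.0, size=(10, 5)) for _ in range(20)]
    pos = []
    for _ in range(20):
        b = rng.normal(0.0, 1.0, size=(10, 5))
        b[0] += 3.0  # the single "positive" witness instance
        pos.append(b)
    bags = neg + pos
    labels = np.array([0] * 20 + [1] * 20)
    forest = fit_mil_forest(bags, labels)
    pred, _ = predict_bags(forest, bags)
    print("training-bag accuracy:", (pred == labels).mean())
```

The max-aggregation step is what distinguishes bag-level from instance-level prediction here; the paper instead learns how to combine the decisions of multiple trees under the same constraints, which this baseline does not attempt.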
Year | DOI | Venue |
---|---|---|
2014 | 10.1109/ICPR.2014.647 | ICPR |
Field | DocType | ISSN
---|---|---|
Decision tree, Data set, Instance-based learning, Pattern recognition, Computer science, Weight-balanced tree, Artificial intelligence, Random forest, Machine learning | Conference | 1051-4651
Citations | PageRank | References
---|---|---|
0 | 0.34 | 12
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Christoph N. Straehle | 1 | 127 | 7.57 |
Melih Kandemir | 2 | 182 | 16.91 |
Ullrich Köthe | 3 | 0 | 0.34 |
Fred A. Hamprecht | 4 | 962 | 76.24 |