Title
Consistency of Probabilistic Classifier Trees.
Abstract
Label tree classifiers are commonly used for efficient multi-class and multi-label classification. They represent a predictive model in the form of a tree-like hierarchy of (internal) classifiers, each of which is trained on a simpler (often binary) subproblem, and predictions are made by (greedily) following these classifiers’ decisions from the root to a leaf of the tree. Unfortunately, this approach normally does not assure consistency for different losses on the original prediction task, even if the internal classifiers are consistent for their subtasks. In this paper, we thoroughly analyze a class of methods referred to as probabilistic classifier trees (PCTs). Thanks to training probabilistic classifiers at the internal nodes of the hierarchy, these methods allow for searching the tree structure in a more sophisticated manner, thereby producing predictions of a less greedy nature. Our main result is a regret bound for 0/1 loss, which can easily be extended to ranking-based losses. In this regard, PCTs nicely complement a related approach called filter trees (FTs), and can indeed be seen as a natural alternative to them. We compare the two approaches both theoretically and empirically.
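The abstract contrasts greedy root-to-leaf decoding with the less greedy search that probabilistic classifiers at internal nodes make possible. The following is a minimal illustrative sketch (not the paper's implementation): it assumes a toy tree in which each internal node stores the conditional probability of reaching each child, and compares greedy decoding with a best-first (uniform-cost) search that maximizes the product of conditional probabilities along the path. All names and the tree encoding are hypothetical.

```python
import heapq

# Hypothetical tree encoding: a node is either ("leaf", label) or
# ("node", [(prob, child), ...]), where prob is the conditional
# probability of descending into that child given the current node.

def greedy_predict(node):
    """Standard label-tree decoding: follow the most probable child at every step."""
    while node[0] == "node":
        _, node = max(node[1], key=lambda pc: pc[0])
    return node[1]

def best_first_predict(root):
    """Less greedy decoding: uniform-cost search for the leaf whose
    root-to-leaf path has the highest product of conditional probabilities."""
    counter = 0                      # tie-breaker so nodes are never compared
    heap = [(-1.0, counter, root)]   # heapq is a min-heap, so negate probabilities
    while heap:
        neg_p, _, node = heapq.heappop(heap)
        if node[0] == "leaf":
            return node[1]           # first popped leaf has maximal path probability
        for prob, child in node[1]:
            counter += 1
            heapq.heappush(heap, (neg_p * prob, counter, child))

# Example where the two strategies disagree: greedy commits to the
# left subtree (0.6), but the single leaf on the right (0.4) beats
# any left leaf (at most 0.6 * 0.5 = 0.3) in total path probability.
tree = ("node", [
    (0.6, ("node", [(0.5, ("leaf", "A")), (0.5, ("leaf", "B"))])),
    (0.4, ("leaf", "C")),
])
```

Here `greedy_predict(tree)` returns `"A"` while `best_first_predict(tree)` returns `"C"`, illustrating why greedy decoding alone cannot guarantee consistency with respect to the original loss.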
Year: 2016
Venue: ECML/PKDD
Field: Stochastic gradient descent, Pattern recognition, Ranking, Regret, Greedy algorithm, Code word, Artificial intelligence, Probabilistic logic, Hierarchy, Probabilistic classification, Machine learning, Mathematics
DocType: Conference
Citations: 2
PageRank: 0.35
References: 13
Authors: 5
Name                    Order  Citations  PageRank
Krzysztof Dembczynski     1         41      3.30
Wojciech Kotlowski        2        158     16.32
Willem Waegeman           3        447     29.82
Róbert Busa-Fekete        4         23      4.48
Eyke Hüllermeier          5       3423    213.52