| Abstract |
|---|
| Different real-world applications of multi-label classification often demand different evaluation criteria. We formalize this demand with a general setup, cost-sensitive multi-label classification (CSMLC), which takes the evaluation criterion into account during learning. Nevertheless, most existing algorithms focus on optimizing only a few specific evaluation criteria and cannot systematically deal with different ones. In this paper, we propose a novel algorithm, called condensed filter tree (CFT), for optimizing any criterion in CSMLC. CFT is derived by reducing CSMLC to the well-known filter tree algorithm for cost-sensitive multi-class classification via constructing the label powerset. We cope with the difficulty of having exponentially many extended classes within the powerset during representation, training, and prediction by carefully designing the tree structure and focusing on the key nodes. Experimental results across many real-world datasets validate that CFT is competitive with special-purpose algorithms on specific criteria and achieves better performance on general criteria. |
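The reduction sketched in the abstract starts from the label-powerset view: each binary label vector of a multi-label instance is treated as one "extended class" of a multi-class problem, so K labels yield 2^K classes. The snippet below is only an illustrative sketch of that enumeration (not the paper's CFT implementation; the function name `label_powerset` is invented here) to make the exponential blow-up concrete.

```python
# Illustrative sketch of the label-powerset reduction (not the paper's code):
# every binary label vector over K labels becomes one extended class,
# giving 2^K extended classes in total.
from itertools import product

def label_powerset(num_labels):
    """Map each label vector (tuple of 0/1) to an extended-class index."""
    return {bits: idx for idx, bits in enumerate(product((0, 1), repeat=num_labels))}

classes = label_powerset(3)   # K = 3 labels -> 2^3 = 8 extended classes
print(len(classes))           # 8
print(classes[(1, 0, 1)])     # 5: index of the label vector (1, 0, 1)
```

The exponential size of this class space is exactly why CFT must avoid materializing the full tree and instead concentrate on the key nodes.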
| Year | Venue | Field |
|---|---|---|
| 2014 | ICML | Data mining, Computer science, Multi-label classification, Artificial intelligence, Tree structure, Machine learning |
| DocType | Citations | PageRank |
|---|---|---|
| Conference | 12 | 0.58 |
| References | Authors |
|---|---|
| 17 | 2 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Chun-Liang Li | 1 | 152 | 16.48 |
| Hsuan-Tien Lin | 2 | 829 | 74.77 |