Title
Partial Multi-Label Learning with Meta Disambiguation
Abstract
In partial multi-label learning (PML) problems, each instance is partially annotated with a candidate label set that contains multiple relevant labels along with some noisy labels. To solve PML problems, existing methods typically try to recover the ground-truth information from the partial annotations by relying on extra assumptions about the data structure. Since such assumptions rarely hold in real-world applications, the trained model may not generalize well to varied PML tasks. In this paper, we propose a novel approach for partial multi-label learning with meta disambiguation (PML-MD). Instead of relying on extra assumptions, we disambiguate between ground-truth and noisy labels in a meta-learning fashion. On one hand, the multi-label classifier is trained by minimizing a confidence-weighted ranking loss, which exploits the supervised information in proportion to the estimated label quality; on the other hand, the confidence of each candidate label is adaptively estimated according to its performance on a small validation set. To speed up optimization, these two procedures are performed alternately with an online approximation strategy. Comprehensive experiments on multiple datasets and varied evaluation metrics validate the effectiveness of the proposed method.
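The alternating scheme sketched in the abstract (confidence-weighted training of the classifier, meta-update of the label confidences on a small clean validation set) can be illustrated with a minimal PyTorch sketch. The code below is an assumption-laden illustration, not the authors' implementation: the classifier is a plain linear model, the ranking loss is a simple pairwise hinge between candidate and non-candidate labels, and all names (ranking_loss, pml_meta_step, conf_logits) are hypothetical.

```python
# Minimal sketch of meta disambiguation for PML, assuming a linear classifier W (D x L),
# a 0/1 float candidate-label mask `cand`, and per-label confidence logits `conf_logits`.
import torch
import torch.nn.functional as F

def ranking_loss(scores, cand, conf):
    """Confidence-weighted pairwise hinge ranking loss.
    scores: (B, L) classifier outputs; cand: (B, L) 0/1 candidate mask;
    conf:   (B, L) confidence of each candidate label (0 on non-candidates)."""
    pos = scores.unsqueeze(2)                                  # (B, L, 1) candidate side
    neg = scores.unsqueeze(1)                                  # (B, 1, L) non-candidate side
    margin = F.relu(1.0 - (pos - neg))                         # hinge on every label pair
    pair_mask = cand.unsqueeze(2) * (1 - cand).unsqueeze(1)    # candidate vs. non-candidate pairs
    weighted = conf.unsqueeze(2) * pair_mask * margin          # weight pairs by label confidence
    return weighted.sum() / pair_mask.sum().clamp(min=1)

def pml_meta_step(W, conf_logits, x, cand, x_val, y_val, lr=0.1, meta_lr=0.1):
    """One alternation: virtual classifier update -> confidence update -> real update."""
    conf = torch.sigmoid(conf_logits) * cand                   # confidences live on candidates only
    # 1) virtual one-step classifier update (online approximation of the inner problem)
    loss_tr = ranking_loss(x @ W, cand, conf)
    grad_W, = torch.autograd.grad(loss_tr, W, create_graph=True)
    W_virtual = W - lr * grad_W
    # 2) evaluate the virtual classifier on a small clean validation set, update confidences
    loss_val = F.binary_cross_entropy_with_logits(x_val @ W_virtual, y_val)
    grad_c, = torch.autograd.grad(loss_val, conf_logits)
    conf_logits = (conf_logits - meta_lr * grad_c).detach().requires_grad_()
    # 3) real classifier update with the refreshed (fixed) confidences
    conf = torch.sigmoid(conf_logits).detach() * cand
    loss_tr = ranking_loss(x @ W, cand, conf)
    grad_W, = torch.autograd.grad(loss_tr, W)
    W = (W - lr * grad_W).detach().requires_grad_()
    return W, conf_logits
```

The single gradient step used to build W_virtual stands in for the "online approximation strategy" mentioned in the abstract: rather than solving the inner training problem to convergence before each confidence update, the two procedures alternate one step at a time.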
Year
2021
DOI
10.1145/3447548.3467259
Venue
Knowledge Discovery and Data Mining
Keywords
partial multi-label learning, candidate label set, disambiguation, ranking loss, meta learning
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name             Order  Citations  PageRank
Ming-Kun Xie     1      5          2.81
Feng Sun         2      0          0.34
Sheng-Jun Huang  3      475        27.21