Title |
---|
SaR: Self-adaptive Refinement on Pseudo Labels for Multiclass-Imbalanced Semi-supervised Learning |
Abstract |
---|
Class-imbalanced datasets can severely deteriorate the performance of semi-supervised learning (SSL). This is largely due to confirmation bias, especially when the pseudo labels are highly biased towards the majority classes. Traditional resampling or reweighting techniques may not be directly applicable when the unlabeled data distribution is unknown. Inspired by the threshold-moving method, which performs well in supervised binary classification tasks, we provide a simple yet effective scheme to address the multiclass imbalance issue in SSL. This scheme, named SaR, is a Self-adaptive Refinement of soft labels before generating pseudo labels. The pseudo labels generated after SaR are less biased, yielding higher-quality data for training the classifier. We show that SaR consistently improves recent consistency-based SSL algorithms on various image classification problems across different imbalance ratios. We also show that SaR is robust to situations where the unlabeled data follow a different distribution than the labeled data. Hence, SaR does not rely on the assumption that unlabeled data share the same distribution as the labeled data. |
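For intuition, the threshold-moving idea that the abstract cites can be sketched as rescaling a model's soft labels by estimated class frequencies before taking the argmax, so that majority classes are penalized. This is a minimal, generic illustration only: the names `refine_soft_labels`, `class_freq`, and `tau` are assumptions, and the paper's actual SaR refinement rule is not specified in this record.

```python
import numpy as np

def refine_soft_labels(probs, class_freq, tau=1.0):
    """Threshold-moving-style refinement of soft labels (illustrative sketch).

    Divides each class probability by its estimated frequency (raised to a
    temperature `tau`) and renormalizes, so pseudo labels are less biased
    toward majority classes. Hypothetical API; not the paper's exact SaR rule.
    """
    probs = np.asarray(probs, dtype=float)
    freq = np.asarray(class_freq, dtype=float)
    adjusted = probs / (freq ** tau)                   # penalize frequent classes
    adjusted /= adjusted.sum(axis=-1, keepdims=True)   # renormalize to a distribution
    return adjusted

# A majority-biased prediction: class 0 is the majority class (90% of data).
soft = [[0.6, 0.4]]
refined = refine_soft_labels(soft, class_freq=[0.9, 0.1])
pseudo_label = int(np.argmax(refined[0]))  # after refinement, the minority class wins
```

With the raw soft label, the pseudo label would be the majority class 0; after frequency-based rescaling ([0.6/0.9, 0.4/0.1] renormalized to roughly [0.14, 0.86]), the pseudo label flips to the minority class 1, which is the debiasing effect the abstract describes.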
Year | DOI | Venue |
---|---|---|
2022 | 10.1109/CVPRW56347.2022.00454 | IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DocType | Volume | Issue
---|---|---|
Conference | 2022 | 1
Citations | PageRank | References
---|---|---|
0 | 0.34 | 0
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zhengfeng Lai | 1 | 0 | 0.34 |
Chao Wang | 2 | 0 | 0.34 |
Sen-Ching S. Cheung | 3 | 776 | 70.97 |
Chen-Nee Chuah | 4 | 2006 | 161.34 |