Title
SaR: Self-adaptive Refinement on Pseudo Labels for Multiclass-Imbalanced Semi-supervised Learning
Abstract
Class-imbalanced datasets can severely degrade the performance of semi-supervised learning (SSL), owing to confirmation bias, especially when the pseudo labels are heavily biased towards the majority classes. Traditional resampling or reweighting techniques may not be directly applicable when the unlabeled data distribution is unknown. Inspired by the threshold-moving method, which performs well in supervised binary classification, we propose a simple yet effective scheme to address the multiclass imbalance issue in SSL. This scheme, named SaR, is a Self-adaptive Refinement of soft labels applied before pseudo labels are generated. The pseudo labels produced after SaR are less biased, yielding higher-quality data for training the classifier. We show that SaR consistently improves recent consistency-based SSL algorithms on various image classification problems across different imbalance ratios. We also show that SaR is robust when the unlabeled data follow a different distribution from the labeled data; hence, SaR does not rely on the assumption that unlabeled and labeled data share the same distribution.
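The abstract does not give the exact refinement rule, so the sketch below only illustrates the general idea of threshold-moving-style debiasing: soft labels (softmax probabilities) are rescaled against an estimated class frequency before the usual confidence thresholding and argmax that produce pseudo labels. The function names refine_soft_labels and pseudo_label, the inverse-frequency weighting, and the class_freq estimate are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of refining soft labels before pseudo-labeling
# (threshold-moving-style debiasing); not the authors' SaR code.
import numpy as np

def refine_soft_labels(probs, class_freq, eps=1e-8):
    """Rescale per-class probabilities by the inverse of an estimated class
    frequency, then renormalize, so minority classes are less suppressed."""
    weights = 1.0 / (class_freq + eps)      # down-weight majority classes
    refined = probs * weights               # rescale soft labels
    return refined / refined.sum(axis=1, keepdims=True)

def pseudo_label(probs, class_freq, threshold=0.95):
    """Generate pseudo labels from refined soft labels; keep only confident ones."""
    refined = refine_soft_labels(probs, class_freq)
    labels = refined.argmax(axis=1)
    mask = refined.max(axis=1) >= threshold  # confidence-based selection
    return labels, mask

# Toy usage: 3 unlabeled samples, 3 classes, class 0 is the majority class.
probs = np.array([[0.6, 0.3, 0.1],
                  [0.5, 0.45, 0.05],
                  [0.4, 0.2, 0.4]])
class_freq = np.array([0.7, 0.2, 0.1])       # e.g., running pseudo-label counts
labels, mask = pseudo_label(probs, class_freq, threshold=0.5)
print(labels, mask)

In this sketch the per-class weights play the role of moved decision thresholds: instead of lowering the acceptance threshold for minority classes directly, their probabilities are boosted before a single global threshold is applied.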
Year
DOI
Venue
2022
10.1109/CVPRW56347.2022.00454
IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DocType
Volume
Issue
Conference
2022
1
Citations 
PageRank 
References 
0
0.34
0
Authors
4
Name                 Order  Citations  PageRank
Zhengfeng Lai        1      0          0.34
Chao Wang            2      0          0.34
Sen-Ching S. Cheung  3      776        70.97
Chen-Nee Chuah       4      2006       161.34