Title
Class-Aware Contrastive Semi-Supervised Learning
Abstract
Pseudo-label-based semi-supervised learning (SSL) has achieved great success in exploiting raw, unlabeled data. However, its training procedure suffers from confirmation bias due to the noise contained in self-generated artificial labels. Moreover, the model's judgment becomes noisier in real-world applications with extensive out-of-distribution data. To address these issues, we propose a general method named Class-aware Contrastive Semi-Supervised Learning (CCSSL), a drop-in helper that improves pseudo-label quality and enhances the model's robustness in real-world settings. Rather than treating real-world data as one union set, our method separately handles reliable in-distribution data, using class-wise clustering to blend into the downstream task, and noisy out-of-distribution data, using image-wise contrastive learning for better generalization. Furthermore, by applying target reweighting, we emphasize learning from clean labels while simultaneously suppressing learning from noisy ones. Despite its simplicity, CCSSL yields significant performance improvements over state-of-the-art SSL methods on the standard datasets CIFAR100 [18] and STL10 [8]. On the real-world dataset Semi-iNat 2021 [27], it improves FixMatch [25] by 9.80% and CoMatch [19] by 3.18%. Code is available at https://github.com/TencentYoutuResearch/Classification-SemiCLS.
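The class-aware contrastive idea described in the abstract can be sketched as a contrastive loss whose positive set depends on pseudo-label confidence: samples whose classifier predictions are confident and agree are pulled together class-wise, while unconfident samples fall back to instance-wise contrast with only their own augmented view. This is a minimal illustrative sketch, not the paper's implementation; the function name, threshold, and temperature values are assumptions.

```python
import numpy as np

def l2norm(x):
    # Normalize embeddings so dot products are cosine similarities.
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def ccssl_contrastive_loss(z1, z2, probs, tau=0.1, thresh=0.9):
    """Illustrative class-aware contrastive loss (hypothetical sketch).

    z1, z2: (N, D) embeddings of two augmented views of N images.
    probs:  (N, C) pseudo-label distributions from the classifier head.
    Confident samples sharing a pseudo-class attract each other
    (class-wise clustering); unconfident samples use only their own
    second view as a positive (image-wise contrast).
    """
    n = z1.shape[0]
    z = l2norm(np.concatenate([z1, z2], axis=0))           # (2N, D)
    labels = probs.argmax(1)
    conf = probs.max(1) >= thresh                          # confidence mask
    labels_all = np.concatenate([labels, labels])
    conf_all = np.concatenate([conf, conf])

    sim = z @ z.T / tau                                    # (2N, 2N) logits
    self_mask = np.eye(2 * n, dtype=bool)
    # Instance positives: the other augmented view of the same image.
    inst_pos = np.roll(self_mask, n, axis=1)
    # Class positives: both samples confident and same pseudo-class.
    cls_pos = (labels_all[:, None] == labels_all[None, :]) \
              & conf_all[:, None] & conf_all[None, :] & ~self_mask
    pos = inst_pos | cls_pos

    # Softmax over all non-self pairs, averaged over the positive set.
    exp = np.exp(np.where(self_mask, -np.inf, sim))
    log_prob = sim - np.log(exp.sum(1, keepdims=True))
    loss = -(log_prob * pos).sum(1) / pos.sum(1)
    return loss.mean()
```

With a uniform `probs` no sample passes the threshold and the loss reduces to a standard instance-wise (SimCLR-style) objective; with one-hot `probs` it behaves like a supervised contrastive loss over pseudo-classes.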
Year
2022
DOI
10.1109/CVPR52688.2022.01402
Venue
IEEE Conference on Computer Vision and Pattern Recognition
Keywords
Self- & semi- & meta- & unsupervised learning
DocType
Conference
Volume
2022
Issue
1
Citations
0
PageRank
0.34
References
0
Authors
9
Name            Order  Citations  PageRank
Fan Yang        1      1          1.38
Kai Wu          2      0          0.34
Shuyi Zhang     3      0          0.34
Guannan Jiang   4      0          0.34
Yong Liu        5      213        45.82
Feng Zheng      6      369        31.93
Wei Zhang       7      0          0.34
Chengjie Wang   8      43         19.03
Long Zeng       9      0          0.34