Title
Adaptive Contrastive Learning with Label Consistency for Source Data Free Unsupervised Domain Adaptation
Abstract
Unsupervised domain adaptation, which aims to alleviate the domain shift between a source domain and a target domain, has attracted extensive research interest; however, access to source data is often impossible in practical applications due to privacy concerns and intellectual property rights. In this paper, we address the more challenging and practical setting of source-free unsupervised domain adaptation, which must adapt a source-domain model to the target domain without the aid of source-domain data. We propose label-consistent contrastive learning (LCCL), an adaptive contrastive learning framework for source-free unsupervised domain adaptation that encourages target-domain samples to learn class-level discriminative features. Since source-domain data are unavailable, we introduce a memory bank that stores samples sharing the same pseudo-label output together with samples obtained by clustering, so that trusted historical samples participate in contrastive learning. In addition, we show that LCCL is a general framework that can also be applied to standard unsupervised domain adaptation. Extensive experiments on digit recognition and image classification benchmark datasets demonstrate the effectiveness of the proposed method.
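The memory-bank idea described in the abstract can be sketched as follows. This is a minimal, illustrative assumption of how such a component might look (the class and function names `MemoryBank` and `lccl_loss` are hypothetical, not the authors' released implementation): bank entries that share the query's pseudo label act as positives in an InfoNCE-style contrastive loss, and all other entries act as negatives.

```python
# Hypothetical sketch of label-consistent contrastive learning with a
# memory bank. Names and details are illustrative assumptions, not the
# paper's actual code.
import numpy as np

class MemoryBank:
    """Stores L2-normalized features together with their pseudo labels,
    keeping only the most recent `size` entries."""
    def __init__(self, dim, size=512):
        self.features = np.zeros((0, dim))
        self.labels = np.zeros(0, dtype=int)
        self.size = size

    def push(self, feats, pseudo_labels):
        # Normalize features so similarities are cosine similarities.
        feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)
        self.features = np.vstack([self.features, feats])[-self.size:]
        self.labels = np.concatenate([self.labels, pseudo_labels])[-self.size:]

def lccl_loss(query, pseudo_label, bank, tau=0.07):
    """InfoNCE-style loss: bank entries sharing the query's pseudo label
    are positives; all other bank entries are negatives."""
    q = query / np.linalg.norm(query)
    sims = bank.features @ q / tau          # temperature-scaled similarities
    pos = bank.labels == pseudo_label
    if not pos.any():
        return 0.0                          # no trusted positives yet
    log_prob = sims - np.log(np.exp(sims).sum())
    return -log_prob[pos].mean()
```

As expected for a contrastive objective, a query aligned with same-pseudo-label bank features yields a much lower loss than the same query paired with a conflicting pseudo label.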
Year: 2022
DOI: 10.3390/s22114238
Venue: SENSORS
Keywords: unsupervised domain adaptation, contrastive learning, source free domain adaptation
DocType: Journal
Volume: 22
Issue: 11
ISSN: 1424-8220
Citations: 0
PageRank: 0.34
References: 0
Authors: 7
Name | Order | Citations | PageRank
Xuejun Zhao | 1 | 0 | 0.34
Rafal Stanislawski | 2 | 46 | 11.53
Paolo Gardoni | 3 | 0 | 2.03
Maciej Sulowicz | 4 | 0 | 0.68
Adam Glowacz | 5 | 4 | 4.90
Grzegorz Krolczyk | 6 | 0 | 0.34
Li, Z. | 7 | 7 | 4.55