Title
Hierarchical contrastive adaptation for cross-domain object detection
Abstract
Deep-learning-based object detection has advanced substantially in recent years. However, applying a detector trained on a label-rich domain to an unseen domain causes a performance drop due to domain shift. To address this problem, we propose a novel unsupervised domain adaptation method that adapts from a labeled source domain to an unlabeled target domain. Recent approaches based on adversarial learning are somewhat effective at aligning the feature distributions of different domains, but for the complex detection task the decision boundary becomes strongly source-biased when the model is trained only with source labels and aligned over the entire feature distribution. In this paper, we use image translation to generate translated versions of the source and target domains, bridging the large domain gap and enabling a paired adaptation. We propose a hierarchical contrastive adaptation method between the original and translated domains that encourages the detector to learn domain-invariant yet discriminative features. To emphasize foreground instances and cope with the noise in translated images, we further propose foreground attention reweighting for instance-aware adaptation. Experiments on three cross-domain detection scenarios show that our method achieves state-of-the-art results against competing approaches, demonstrating its effectiveness.
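The paired adaptation described in the abstract contrasts each original-domain feature with its image-translated counterpart. As a minimal illustrative sketch (the paper's exact loss is not specified in the abstract), an InfoNCE-style contrastive objective over such paired features could look like the following; the function name, temperature value, and NumPy formulation are assumptions for illustration only:

```python
import numpy as np

def info_nce_paired(orig, trans, temperature=0.1):
    """InfoNCE-style contrastive loss over paired features.

    Illustrative sketch, not the paper's implementation: orig[i]
    (e.g. a source-domain feature) and trans[i] (its translated
    counterpart) form the positive pair; the other rows of `trans`
    act as negatives.
    """
    # L2-normalise so dot products become cosine similarities.
    o = orig / np.linalg.norm(orig, axis=1, keepdims=True)
    t = trans / np.linalg.norm(trans, axis=1, keepdims=True)
    logits = o @ t.T / temperature            # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    # Cross-entropy with the diagonal (the paired sample) as target.
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

When the two views of each sample are well aligned, the diagonal dominates each row of the similarity matrix and the loss approaches zero; mismatched pairs drive the loss up, which is the signal that pulls translated and original features together.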
Year
2022
DOI
10.1007/s00138-022-01317-7
Venue
Machine Vision and Applications
Keywords
Unsupervised domain adaptation, Object detection, Transfer learning
DocType
Journal
Volume
33
Issue
4
ISSN
0932-8092
Citations
0
PageRank
0.34
References
3
Authors
4
Name                Order  Citations  PageRank
Ziwei Deng          1      0          0.68
Quan Kong           2      0          0.68
Naoto Akira         3      0          0.34
Tomoaki Yoshinaga   4      0          1.01