Title
Cycle and Self-Supervised Consistency Training for Adapting Semantic Segmentation of Aerial Images
Abstract
Semantic segmentation is a critical problem for many remote sensing (RS) image applications. Benefiting from large-scale pixel-level labeled data and the continuous evolution of deep neural network architectures, the performance of semantic segmentation approaches has been steadily improving. However, deploying a well-trained model in unseen and diverse testing environments remains a major challenge: a large gap between the data distributions of the training and test domains results in severe performance loss, while manual dense labeling is costly and does not scale. To this end, we propose an unsupervised domain adaptation framework for RS image semantic segmentation that is both practical and effective. The framework is built on the consistency principle, comprising cycle consistency in the input space and self-supervised consistency during training. Specifically, we introduce cycle-consistent generative adversarial networks to reduce the discrepancy between the source and target distributions by translating one into the other. The translated source data then drive supervised training of the semantic segmentation model. We further enforce consistency of model predictions across transformations of the target images, providing self-supervision for the unlabeled target data. Experiments and extensive ablation studies demonstrate the effectiveness of the proposed approach on two challenging benchmarks, on which we achieve improvements in accuracy of up to 9.95% and 7.53%, respectively, over state-of-the-art methods.
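The self-supervised consistency term described in the abstract can be illustrated with a minimal sketch, assuming a PyTorch-style segmentation network. The function name consistency_loss, the choice of horizontal flipping as the target-image transformation, and the mean-squared-error penalty are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn.functional as F

def consistency_loss(seg_model, target_images):
    # target_images: an unlabeled target-domain batch of shape (N, C, H, W).
    # Predictions on the original images act as a pseudo-reference;
    # no gradient flows through them (stop-gradient).
    with torch.no_grad():
        ref_probs = F.softmax(seg_model(target_images), dim=1)

    # Apply a simple geometric transformation to the inputs: horizontal flip.
    flipped = torch.flip(target_images, dims=[3])
    logits = seg_model(flipped)

    # Undo the flip on the prediction so both outputs are aligned pixel-wise,
    # then penalize their disagreement.
    probs = F.softmax(torch.flip(logits, dims=[3]), dim=1)
    return F.mse_loss(probs, ref_probs)

In a full training loop, this term would presumably be added, with a weighting factor, to the supervised segmentation loss computed on the translated source data.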
Year
2022
DOI
10.3390/rs14071527
Venue
REMOTE SENSING
Keywords
unsupervised domain adaptation, semantic segmentation, self-supervision, remote sensing image
DocType
Journal
Volume
14
Issue
7
ISSN
2072-4292
Citations
0
PageRank
0.34
References
0
Authors
6
Name           Order   Citations   PageRank
Han Gao        1       0           0.68
Yang Zhao      2       0           0.34
Peng Guo       3       0           1.35
Zihao Sun      4       0           0.34
Xiuwan Chen    5       331         8.04
Yunwei Tang    6       0           0.34