Title
CT-UNet: Context-Transfer-UNet for Building Segmentation in Remote Sensing Images
Abstract
With the proliferation of remote sensing images, segmenting buildings accurately in such images has become a critical challenge. First, most networks have poor recognition ability on high-resolution images, resulting in blurred boundaries in the segmented building maps. Second, the similarity between buildings and background leads to intra-class inconsistency. To address these two problems, we propose a UNet-based network named Context-Transfer-UNet (CT-UNet). Specifically, we design a Dense Boundary Block: the Dense Block exploits a feature-reuse mechanism to refine features and strengthen recognition capability, while the Boundary Block introduces low-level spatial information to resolve the fuzzy-boundary problem. Then, to handle intra-class inconsistency, we construct a Spatial Channel Attention Block, which combines contextual spatial information and selects more discriminative features along both the spatial and channel dimensions. Finally, we propose an improved loss function that incorporates an evaluation indicator to better align training with the segmentation objective. Based on the proposed CT-UNet, we achieve 85.33% mean IoU on the Inria dataset, 91.00% mean IoU on the WHU dataset and an 83.92% F1-score on the Massachusetts dataset. These results outperform our baseline (U-Net ResNet-34) by 3.76%, exceed Web-Net by 2.24% and surpass HFSA-Unet by 2.17%.
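The abstract does not spell out the improved loss. As a minimal sketch, assuming the added "evaluation indicator" is a differentiable soft-IoU term combined with pixel-wise binary cross-entropy (the function name combined_loss, the smooth and iou_weight parameters, and the PyTorch framing are illustrative assumptions, not the authors' implementation):

```python
import torch
import torch.nn.functional as F

def combined_loss(logits, target, smooth=1.0, iou_weight=1.0):
    """Pixel-wise binary cross-entropy plus a soft-IoU term.

    logits: raw network outputs, shape (N, 1, H, W)
    target: float binary ground-truth masks of the same shape, values in {0, 1}
    """
    # Standard pixel-wise binary cross-entropy computed on the raw logits.
    bce = F.binary_cross_entropy_with_logits(logits, target)

    # Differentiable (soft) IoU computed from predicted probabilities.
    probs = torch.sigmoid(logits)
    intersection = (probs * target).sum(dim=(1, 2, 3))
    union = (probs + target - probs * target).sum(dim=(1, 2, 3))
    soft_iou = (intersection + smooth) / (union + smooth)

    # Minimising (1 - IoU) nudges training toward the reported evaluation metric.
    return bce + iou_weight * (1.0 - soft_iou.mean())
```

Using logits with binary_cross_entropy_with_logits keeps the cross-entropy term numerically stable, while the (1 - soft IoU) term directly targets the mean-IoU metric reported in the abstract.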
Year
2021
DOI
10.1007/s11063-021-10592-w
Venue
NEURAL PROCESSING LETTERS
Keywords
Remote sensing images, Building segmentation, U-Net, Context information, Attention models
DocType
Journal
Volume
53
Issue
6
ISSN
1370-4621
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order  Citations  PageRank
Sheng Liu       1      5          8.58
Huanran Ye      2      0          0.34
Kun Jin         3      0          0.34
Haohao Cheng    4      0          0.34