Title
Target Detection Model Distillation Using Feature Transition and Label Registration for Remote Sensing Imagery
Abstract
Deep convolutional networks have been widely used in remote sensing target detection for various applications in recent years. Target detection models with many parameters provide better results but are not suitable for resource-constrained devices due to their high computational cost and storage requirements. Furthermore, current lightweight target detection models for remote sensing imagery rarely retain the advantages of existing larger models. Knowledge distillation enables a small student network to learn from a large teacher network, providing both acceleration and compression. However, current knowledge distillation methods typically use mature backbones as teacher and student networks, which are unsuitable for target detection in remote sensing imagery. In this article, we propose a target detection model distillation (TDMD) framework using feature transition and label registration for remote sensing imagery. A lightweight attention network is designed by ranking the importance of the convolutional feature layers in the teacher network. Multiscale feature transition based on a feature pyramid is utilized to constrain the feature maps of the student network. A label registration procedure is proposed to help the student network better learn the output distribution of the teacher network. The proposed method is evaluated on the DOTA and NWPU VHR-10 remote sensing image datasets. The results show that TDMD achieves a mean Average Precision (mAP) of 75.47% and 93.81% on the DOTA and NWPU VHR-10 datasets, respectively. Moreover, the model size is 43% smaller than that of the predecessor model (11.8 MB and 11.6 MB for the two datasets).
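Illustration (not part of the original record): the abstract combines feature-map imitation with learning the teacher's output distribution. Below is a minimal, generic sketch of such a joint distillation objective in PyTorch; the function name, temperature, and weighting factors are assumptions for illustration and do not reproduce the paper's actual feature transition or label registration modules.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_feats, teacher_feats, student_logits,
                      teacher_logits, labels, T=4.0, alpha=0.5, beta=1.0):
    """Generic distillation objective (illustrative sketch only).

    Combines (1) a hard-label task loss, (2) a soft-label KL term on the
    teacher's output distribution, and (3) an L2 feature-matching term on
    paired multiscale feature maps. The TDMD-specific feature transition
    and label registration steps are NOT reproduced here.
    """
    # (1) standard supervised loss on ground-truth labels
    task_loss = F.cross_entropy(student_logits, labels)

    # (2) soft-label distillation: match the teacher's softened distribution
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    # (3) feature imitation: L2 distance between paired pyramid levels,
    #     assuming student and teacher feature maps have matching shapes
    feat_loss = sum(F.mse_loss(s, t) for s, t in zip(student_feats, teacher_feats))

    return task_loss + alpha * soft_loss + beta * feat_loss
```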
Year
2022
DOI
10.1109/JSTARS.2022.3188252
Venue
IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING
Keywords
Object detection, Knowledge engineering, Feature extraction, Remote sensing, Computational modeling, Computational efficiency, Training, Deep neural network, feature transition, label registration, model distillation, remote sensing, target detection
DocType
Journal
Volume
15
ISSN
1939-1404
Citations
0
PageRank
0.34
References
0
Authors
5
Name	Order	Citations	PageRank
Boya Zhao	1	0	0.34
Qing Wang	2	3457	6.64
Yuanfeng Wu	3	0	0.34
Qingqing Cao	4	0	0.34
Qiong Ran	5	23	2.11