Title
Learning Efficient and Accurate Detectors With Dynamic Knowledge Distillation in Remote Sensing Imagery
Abstract
Deep convolutional neural networks (CNNs) have brought a tremendous increase in detection accuracy, but their cumbersome models make them hard to deploy on low-computation edge devices such as satellites and unmanned aerial vehicles. A promising way to tackle this problem is knowledge distillation (KD), which produces lightweight models with satisfactory accuracy. In remote sensing images, objects are usually environment-related and located in cluttered scenes, so the features that carry objects' semantic information are entangled with the background. However, existing distillation methods only imitate the feature distribution of regions that contain objects, which results in poor performance. Furthermore, masses of instances generated by the teacher are blindly inherited by the student, even when some of them are outliers. In this article, we propose a general and effective KD framework called dynamic knowledge distillation (DKD). First, our framework leverages a dynamic global distillation (GD) module to discover valuable regions in both the foreground and the background for multiscale feature imitation, so that potential geographical spatial relationships are not ignored. Second, we propose a dynamic instance selection distillation (ISD) module that gives the student the ability to judge teacher-generated instances by the magnitude of its own detection loss. Third, to handle hard samples in regression more accurately, a training-status-aware loss is tailored to guide the student in mining knowledge about objects with large aspect ratios or small sizes. Extensive experiments demonstrate the effectiveness of the DKD framework. Detection results on the DOTA and NWPU VHR-10 datasets show that our method is suitable for single-stage, two-stage, and even anchor-free detectors and achieves state-of-the-art performance. The code will be made publicly available.
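The instance-selection idea in the abstract can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch rendering of loss-based instance selection for distillation; the function names, the top-k `keep_ratio` heuristic, and the L2 imitation term are assumptions for illustration, not the authors' DKD implementation (which is not included in this record).

```python
# Minimal sketch (PyTorch assumed): the student keeps only teacher-generated
# instances on which its own detection loss is small, discarding likely
# outliers before imitating the teacher's features on them. All names and
# the keep_ratio heuristic are illustrative, not the paper's actual code.
import torch


def select_instances_by_loss(per_instance_loss: torch.Tensor,
                             keep_ratio: float = 0.7) -> torch.Tensor:
    """Boolean mask over instances, keeping the fraction with the
    smallest student detection loss."""
    k = max(1, int(keep_ratio * per_instance_loss.numel()))
    threshold = torch.kthvalue(per_instance_loss, k).values
    return per_instance_loss <= threshold


def instance_distillation_loss(student_feats: torch.Tensor,
                               teacher_feats: torch.Tensor,
                               per_instance_loss: torch.Tensor,
                               keep_ratio: float = 0.7) -> torch.Tensor:
    """L2 imitation loss computed only on the selected instances."""
    mask = select_instances_by_loss(per_instance_loss, keep_ratio)
    return ((student_feats[mask] - teacher_feats[mask]) ** 2).mean()


# Toy usage: 8 candidate instances with 256-d pooled features each.
if __name__ == "__main__":
    s = torch.randn(8, 256)
    t = torch.randn(8, 256)
    det_loss = torch.rand(8)  # student's per-instance detection loss
    print(instance_distillation_loss(s, t, det_loss).item())
```

The design choice sketched here is that selection is driven by the student's own loss rather than the teacher's confidence, matching the abstract's notion of the student's "self-judgment" over which inherited instances to trust.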
Year
2022
DOI
10.1109/TGRS.2021.3130443
Venue
IEEE Transactions on Geoscience and Remote Sensing
Keywords
Context information, instance selection, knowledge distillation (KD), object detection, training-status-aware weighted
DocType
Journal
Volume
60
ISSN
0196-2892
Citations
0
PageRank
0.34
References
0
Authors
6
Name          Order  Citations  PageRank
Yidan Zhang   1      0          0.68
Zhiyuan Yan   2      0          1.35
Xian Sun      3      0          8.45
Wenhui Diao   4      0          4.73
Kun Fu        5      414        57.81
Lei Wang      6      65         54.21