Title
Relational Subsets Knowledge Distillation for Long-Tailed Retinal Diseases Recognition
Abstract
In the real world, medical datasets often exhibit a long-tailed distribution (i.e., a few classes occupy most of the data, while most classes have only a few samples), which results in a challenging imbalanced learning scenario. For example, there are an estimated 40+ different kinds of retinal diseases with variable morbidity, and more than 30 of these conditions are very rare in global patient cohorts, which results in a typical long-tailed learning problem for deep learning-based screening models. In this study, we propose class subset learning, which divides the long-tailed data into multiple class subsets according to prior knowledge, such as regions and phenotype information. This forces the model to focus on learning subset-specific knowledge. More specifically, some related classes reside in fixed retinal regions, and some common pathological features are observed in both majority and minority conditions. With teacher models trained on these subsets, we then distil the multiple teachers into a unified model using a weighted knowledge distillation loss. The proposed framework proves effective for the long-tailed retinal disease recognition task. The experimental results on two different datasets demonstrate that our method is flexible and can be easily plugged into many other state-of-the-art techniques with significant improvements.
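The distillation step described in the abstract can be sketched as a weighted multi-teacher objective: a hard-label cross-entropy term plus a weighted sum of soft-label terms, one per subset teacher. Below is a minimal PyTorch sketch under stated assumptions, not the paper's exact formulation: the function name weighted_kd_loss, the temperature T, the per-teacher subset_weights, and the mixing coefficient alpha are all illustrative, and each subset teacher's logits are assumed to have already been mapped onto the full label space.

```python
# Minimal sketch of a weighted multi-teacher knowledge distillation loss
# (illustrative; not the paper's exact formulation). Assumes each subset
# teacher's logits are already aligned to the full label space.
import torch
import torch.nn.functional as F

def weighted_kd_loss(student_logits, teacher_logits_list, subset_weights,
                     labels, T=2.0, alpha=0.5):
    """Hard-label cross-entropy plus weighted soft-label KD terms.

    student_logits:      (batch, num_classes) from the unified student model
    teacher_logits_list: list of (batch, num_classes), one per subset teacher
    subset_weights:      per-teacher weights (hypothetical weighting scheme)
    T:                   softmax temperature for soft targets
    alpha:               trade-off between hard-label and distillation terms
    """
    # Standard supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Weighted sum of KL divergences to each teacher's softened predictions;
    # the T*T factor keeps gradient magnitudes comparable across temperatures.
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    kd = student_logits.new_zeros(())
    for w, t_logits in zip(subset_weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits / T, dim=1)
        kd = kd + w * F.kl_div(log_p_student, p_teacher,
                               reduction="batchmean") * (T * T)

    return alpha * ce + (1.0 - alpha) * kd
```

Usage would follow the two-stage recipe the abstract outlines: first train one teacher per class subset, then train the unified student by calling, e.g., weighted_kd_loss(student(images), [t(images) for t in teachers], [0.5, 0.3, 0.2], labels), where the weights are placeholder values.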
Year
2021
DOI
10.1007/978-3-030-87237-3_1
Venue
MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2021, PT VIII
Keywords
Retinal diseases recognition, Long-tailed learning, Knowledge distillation, Deep learning
DocType
Conference
Volume
12908
ISSN
0302-9743
Citations
0
PageRank
0.34
References
0
Authors
8
Name, Order, Citations, PageRank
Lie Ju, 1, 2, 0.70
Xin Wang, 2, 0, 1.35
Lin Wang, 3, 2, 2.41
Tongliang Liu, 4, 2, 2.42
Xin Zhao, 5, 2, 2.07
Tom Drummond, 6, 2676, 159.45
Dwarikanath Mahapatra, 7, 0, 0.34
Zongyuan Ge, 8, 149, 27.83