Title
When Unseen Domain Generalization is Unnecessary? Rethinking Data Augmentation.
Abstract
Recent advances in deep learning for medical image segmentation demonstrate expert-level accuracy. In clinically realistic environments, however, such methods perform only marginally well because of differences in image domains, including imaging protocols, device vendors, and patient populations. Here we consider the problem of domain generalization, in which a model is trained once and its performance is expected to generalize to unseen domains. Intuitively, the domain differences within a specific medical imaging modality are smaller than the domain variability of natural images. We rethink data augmentation for medical 3D images and propose a deep stacked transformations (DST) approach for domain generalization. Specifically, a series of n stacked transformations is applied to each image in each mini-batch during network training to account for the contribution of domain-specific shifts in medical images. We comprehensively evaluate our method on three tasks: segmentation of the whole prostate from 3D MRI, the left atrium from 3D MRI, and the left ventricle from 3D ultrasound. We demonstrate that, when trained on a small source dataset, (i) on average, DST models degrade by only 11% (Dice score change) on unseen datasets, compared with conventional augmentation (degrading 39%) and a CycleGAN-based domain adaptation method (degrading 25%); (ii) when evaluated on the same domain, DST is also better, albeit only marginally; (iii) when trained on large-sized data, DST on unseen domains reaches the performance of state-of-the-art fully supervised models. These findings establish a strong benchmark for the study of domain generalization in medical imaging and can inform the design of robust deep segmentation models for clinical deployment.
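The abstract's core idea, a chain of n stacked transformations applied to every image in every mini-batch, can be illustrated in a few lines. The listing below is a minimal NumPy sketch, not the authors' implementation: the particular transformations (gamma adjustment, additive noise, axis flips), the per-transformation probability, and all parameter ranges are illustrative assumptions rather than values from the paper.

import numpy as np

def random_gamma(volume, rng, low=0.7, high=1.5):
    # Random gamma adjustment approximating contrast shifts across scanners/vendors.
    v_min, v_max = float(volume.min()), float(volume.max())
    normed = (volume - v_min) / (v_max - v_min + 1e-8)
    return normed ** rng.uniform(low, high) * (v_max - v_min) + v_min

def random_noise(volume, rng, sigma=0.05):
    # Additive Gaussian noise scaled to the volume's intensity spread.
    return volume + rng.normal(0.0, sigma * (float(volume.std()) + 1e-8), volume.shape)

def random_flip(volume, rng):
    # Flip along a randomly chosen spatial axis.
    return np.flip(volume, axis=int(rng.integers(volume.ndim)))

def stacked_transform(volume, rng, prob=0.5):
    # Apply the transformations in sequence ("stacked"), each triggered with probability `prob`.
    for transform in (random_gamma, random_noise, random_flip):
        if rng.random() < prob:
            volume = transform(volume, rng)
    return np.ascontiguousarray(volume)

# Toy usage on a mini-batch of 3D volumes shaped (batch, depth, height, width).
rng = np.random.default_rng(seed=0)
batch = rng.normal(size=(2, 32, 64, 64)).astype(np.float32)
augmented = np.stack([stacked_transform(vol, rng) for vol in batch])
print(augmented.shape)  # (2, 32, 64, 64)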
Year
2019
Venue
CoRR
DocType
Journal
Volume
abs/1906.03347
Citations
0
PageRank
0.34
References
0
Authors
10
Name                Order  Citations  PageRank
Ling Zhang          1      3          2.75
Xiaosong Wang       2      0          0.68
Dong Yang           3      47         11.05
Thomas Sanford      4      9          1.56
Stephanie Harmon    5      15         2.17
Baris Turkbey       6      212        23.00
Holger Roth         7      737        45.70
Andriy Myronenko    8      1          1.70
Daguang Xu          9      50         14.28
Ziyue Xu            10     597        35.50