Title
On Robustness and Transferability of Convolutional Neural Networks
Abstract
Modern deep convolutional neural networks (CNNs) are often criticized for not generalizing under distributional shifts. However, several recent breakthroughs in transfer learning suggest that these networks can cope with severe distribution shifts and successfully adapt to new tasks from a few training examples. In this work we study, for the first time, the interplay between the out-of-distribution and transfer performance of modern image classification CNNs, and investigate the impact of the pre-training data size, the model scale, and the data preprocessing pipeline. We find that increasing both the training set size and the model size significantly improves robustness to distributional shift. Furthermore, we show that, perhaps surprisingly, simple changes to the preprocessing pipeline, such as modifying the image resolution, can significantly mitigate robustness issues in some cases. Finally, we outline the shortcomings of existing robustness evaluation datasets and introduce a synthetic dataset, SI-SCORE, which we use for a systematic analysis across factors of variation common in visual data, such as object size and position.
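As an illustration of the kind of simple preprocessing change the abstract refers to (evaluating a model at a different input resolution), here is a minimal sketch. It is not the authors' pipeline or models: it assumes a torchvision ResNet-50 with ImageNet weights and a hypothetical local image file example.jpg, and merely compares predictions at two evaluation resolutions.

```python
# Illustrative sketch only: compare predictions of a pretrained classifier
# at the default evaluation resolution (224 px) and a higher one (384 px).
# This is NOT the paper's pipeline; model choice and file path are assumptions.
import torch
from torchvision import models, transforms
from PIL import Image

def build_preprocess(resolution: int):
    """ImageNet-style preprocessing at a configurable evaluation resolution."""
    return transforms.Compose([
        transforms.Resize(int(resolution * 1.15)),   # resize shorter side with a small margin
        transforms.CenterCrop(resolution),           # crop to the target resolution
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1).eval()

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
for resolution in (224, 384):
    x = build_preprocess(resolution)(image).unsqueeze(0)  # shape: (1, 3, R, R)
    with torch.no_grad():
        pred = model(x).argmax(dim=1).item()
    print(f"resolution={resolution}: predicted class index {pred}")
```

The global average pooling in standard ResNets makes them agnostic to the input resolution, which is what allows the same weights to be evaluated at 224 and 384 pixels in this sketch.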
Year
2021
DOI
10.1109/CVPR46437.2021.01619
Venue
2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021)
DocType
Conference
ISSN
1063-6919
Citations
0
PageRank
0.34
References
0
Authors
15
Name                  Order  Citations  PageRank
Djolonga, Josip       1      68         7.87
Jessica Yung          2      0          0.34
Michael Tschannen     3      143        13.58
Rob Romijnders        4      0          0.34
Lucas Beyer           5      232        13.50
Alexander Kolesnikov  6      152        11.94
Joan Puigcerver       7      1          2.08
Minderer, Matthias    8      2          1.37
Alexander D'Amour     9      0          0.34
Dan Moldovan          10     0          0.34
Sylvain Gelly         11     760        59.74
Neil Houlsby          12     153        14.73
Xiaohua Zhai          13     209        13.00
Mario Lucic           14     231        16.10
Gelly Sylvain         15     0          0.34