Title
Swin Transformer V2: Scaling Up Capacity and Resolution
Abstract
We present techniques for scaling Swin Transformer [35] up to 3 billion parameters and making it capable of training with images of up to 1,536×1,536 resolution. By scaling up capacity and resolution, Swin Transformer sets new records on four representative vision benchmarks: 84.0% top-1 accuracy on ImageNet-V2 image classification, 63.1 / 54.4 box / mask mAP on COCO object detection, 59.9 mIoU on ADE20K semantic segmentation, and 86.8% top-1 accuracy on Kinetics-400 video action classification. We tackle issues of training instability and study how to effectively transfer models pre-trained at low resolutions to higher-resolution ones. To this end, several novel techniques are proposed: 1) a residual post-normalization technique and a scaled cosine attention approach to improve the stability of large vision models; 2) a log-spaced continuous position bias technique to effectively transfer models pre-trained at low-resolution images and windows to their higher-resolution counterparts. In addition, we share crucial implementation details that lead to significant savings of GPU memory consumption, making it feasible to train large vision models with regular GPUs. Using these techniques and self-supervised pre-training, we successfully train a strong 3 billion-parameter Swin Transformer model and effectively transfer it to various vision tasks involving high-resolution images or windows, achieving state-of-the-art accuracy on a variety of benchmarks. Code is available at https://github.com/microsoft/Swin-Transformer.
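The abstract names scaled cosine attention and a log-spaced continuous position bias as the stability and transfer techniques. Below is a minimal, hedged PyTorch sketch of how these two ideas can sit together inside a window-attention block; the class name, the bias-MLP hidden size (512), the temperature clamp, and the coordinate normalization constant are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

```python
# Minimal sketch (assumptions as noted above, not the official implementation) of
# scaled cosine attention combined with a log-spaced continuous position bias.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledCosineWindowAttention(nn.Module):
    def __init__(self, dim, num_heads, window_size):
        super().__init__()
        self.num_heads = num_heads
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)
        # Learnable per-head temperature for cosine attention (log-parameterized).
        self.logit_scale = nn.Parameter(torch.log(10 * torch.ones(num_heads, 1, 1)))
        # Small MLP mapping log-spaced relative coordinates to per-head biases.
        self.cpb_mlp = nn.Sequential(
            nn.Linear(2, 512), nn.ReLU(inplace=True),
            nn.Linear(512, num_heads, bias=False))
        Wh, Ww = window_size
        # Relative coordinate table, transformed as sign(x) * log2(1 + |x|).
        rel_h = torch.arange(-(Wh - 1), Wh, dtype=torch.float32)
        rel_w = torch.arange(-(Ww - 1), Ww, dtype=torch.float32)
        table = torch.stack(torch.meshgrid(rel_h, rel_w, indexing="ij"), dim=-1)
        table = torch.sign(table) * torch.log2(1.0 + table.abs()) / math.log2(8)
        self.register_buffer("rel_coords_table", table)        # (2Wh-1, 2Ww-1, 2)
        # Lookup index from each (query, key) token pair into the table.
        coords = torch.stack(torch.meshgrid(
            torch.arange(Wh), torch.arange(Ww), indexing="ij")).flatten(1)
        rel = (coords[:, :, None] - coords[:, None, :]).permute(1, 2, 0).contiguous()
        rel[:, :, 0] += Wh - 1
        rel[:, :, 1] += Ww - 1
        self.register_buffer("rel_index", rel[:, :, 0] * (2 * Ww - 1) + rel[:, :, 1])

    def forward(self, x):                       # x: (B, N, C), N = Wh * Ww tokens
        B, N, C = x.shape
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, C // self.num_heads)
        q, k, v = qkv.permute(2, 0, 3, 1, 4).unbind(0)          # (B, heads, N, d)
        # Cosine similarity in place of the dot product, scaled by a learnable tau.
        attn = F.normalize(q, dim=-1) @ F.normalize(k, dim=-1).transpose(-2, -1)
        attn = attn * torch.clamp(self.logit_scale, max=math.log(100.0)).exp()
        # Continuous position bias generated from the log-spaced coordinates.
        bias = self.cpb_mlp(self.rel_coords_table).view(-1, self.num_heads)
        bias = bias[self.rel_index.view(-1)].view(N, N, self.num_heads)
        attn = (attn + bias.permute(2, 0, 1).unsqueeze(0)).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj(out)

# Example: an 8x8 window of 96-dim tokens with 4 heads.
attn = ScaledCosineWindowAttention(dim=96, num_heads=4, window_size=(8, 8))
y = attn(torch.randn(2, 64, 96))        # -> shape (2, 64, 96)
```

The cosine formulation keeps attention logits in a bounded range, which the paper reports stabilizes training at large model sizes, and generating the bias from log-spaced coordinates with an MLP lets the same parameters extrapolate to larger windows at higher resolution; the clamp on the temperature here is an assumed safeguard rather than a documented detail.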
Year
2022
DOI
10.1109/CVPR52688.2022.01170
Venue
IEEE Conference on Computer Vision and Pattern Recognition
Keywords
Deep learning architectures and techniques, Representation learning
DocType
Conference
Volume
2022
Issue
1
Citations
0
PageRank
0.34
References
0
Authors
12
Name           Order  Citations  PageRank
Ze Liu         1      0          2.03
Han Hu         2      193        14.98
Lin, Yutong    3      0          2.37
Zhuliang Yao   4      0          1.35
Xie, Zhenda    5      0          1.35
Yixuan Wei     6      0          1.01
Jia Ning       7      0          0.68
Yue Cao        8      574        21.49
Zheng Zhang    9      267        29.51
Li Dong        10     582        31.86
Furu Wei       11     1956       107.57
Baining Guo    12     3970       194.91