Title
SSN: Learning Sparse Switchable Normalization via SparsestMax
Abstract
Normalization methods are central to training the parameters of convolutional neural networks (CNNs), which typically contain many convolution layers. Although the layers of a CNN are not homogeneous in the roles they play in representing the prediction function, existing works often employ an identical normalizer in every layer, leaving performance short of ideal. To tackle this problem and further boost performance, the recently proposed switchable normalization (SN) offers a new perspective for deep learning: it learns to select different normalizers for different convolution layers of a ConvNet. However, SN uses the softmax function to learn the importance ratios that combine normalizers, which not only leads to redundant computation compared to a single normalizer but also makes the model less interpretable. This work addresses the issue by presenting sparse switchable normalization (SSN), in which the importance ratios are constrained to be sparse. Unlike \(\ell _1\) and \(\ell _0\) regularizations, which make it difficult to tune layer-wise regularization coefficients, we turn this sparsity-constrained optimization problem into feed-forward computation by proposing SparsestMax, a sparse version of softmax. SSN has several appealing properties. (1) It inherits all the benefits of SN, such as applicability to various tasks and robustness to a wide range of batch sizes. (2) It is guaranteed to select only one normalizer for each normalization layer, avoiding redundant computation and improving the interpretability of normalizer selection. (3) SSN can be transferred to various tasks in an end-to-end manner. Extensive experiments show that SSN outperforms its counterparts on challenging benchmarks such as ImageNet, COCO, Cityscapes, ADE20K, Kinetics and MegaFace. Models and code are available at https://github.com/switchablenorms/Sparse_SwitchNorm.
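The abstract's SparsestMax builds on the idea of a sparse alternative to softmax; the full SparsestMax definition is not given in this record, but the standard sparsemax operator (Euclidean projection of the logits onto the probability simplex) illustrates how a softmax-like map can produce exactly sparse importance ratios. The sketch below is a minimal NumPy implementation of sparsemax, not of the paper's SparsestMax itself; the function name and interface are illustrative assumptions.

```python
import numpy as np

def sparsemax(z):
    """Project logits z onto the probability simplex (Martins & Astudillo's
    sparsemax). Unlike softmax, the output can contain exact zeros, so only a
    few components (possibly one) receive nonzero weight."""
    z = np.asarray(z, dtype=float)
    # Sort logits in decreasing order and accumulate their partial sums.
    z_sorted = np.sort(z)[::-1]
    k = np.arange(1, z.size + 1)
    cssv = np.cumsum(z_sorted)
    # Support size: largest k with 1 + k * z_(k) > sum of top-k logits.
    k_star = k[1 + k * z_sorted > cssv][-1]
    # Threshold tau so that the clipped values sum to 1.
    tau = (cssv[k_star - 1] - 1.0) / k_star
    return np.maximum(z - tau, 0.0)
```

For well-separated logits the output is one-hot, e.g. `sparsemax([3.0, 0.0, 0.0])` yields `[1.0, 0.0, 0.0]`, which matches the abstract's goal of selecting a single normalizer per layer while keeping a softmax-like, differentiable-almost-everywhere form.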
Year: 2020
DOI: 10.1007/s11263-019-01269-y
Venue: International Journal of Computer Vision
Keywords: Deep learning, Normalization, Classification, Optimization
DocType: Journal
Volume: 128
Issue: 8
ISSN: 0920-5691
Citations: 2
PageRank: 0.40
References: 0
Authors: 6
Name | Order | Citations | PageRank
Wenqi Shao | 1 | 10 | 4.63
Jingyu Li | 2 | 10 | 1.93
Jiamin Ren | 3 | 12 | 2.25
Ruimao Zhang | 4 | 325 | 18.86
Xiaogang Wang | 5 | 9647 | 386.70
Ping Luo | 6 | 15 | 4.32