Title
AlphaMEX: A smarter global pooling method for convolutional neural networks.
Abstract
Deep convolutional neural networks have achieved great success in image classification, and feature extractors learned from CNNs are used in many computer vision tasks. The global pooling layer plays an important role in deep convolutional neural networks. We find that the input feature maps of global pooling become sparse as the Batch Normalization and ReLU layer combination is used more widely, which makes the original global pooling inefficient. In this paper, we propose a novel end-to-end trainable global pooling operator, AlphaMEX Global Pool, for convolutional neural networks. A nonlinear, smooth log-mean-exp function, called AlphaMEX, is designed to extract features effectively and make networks smarter. Compared with the original global pooling layer, the proposed method improves classification accuracy without adding layers or many redundant parameters. Experimental results on CIFAR-10, CIFAR-100, SVHN and ImageNet demonstrate the effectiveness of the proposed method. AlphaMEX-ResNet outperforms the original ResNet-110 by 8.3% on CIFAR-10+, and the top-1 error rate of AlphaMEX-DenseNet (k = 12) reaches 5.03%, outperforming the original DenseNet (k = 12) by 4.0%.
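The abstract does not spell out the operator's exact parameterization, but the general idea of a smooth, end-to-end trainable log-mean-exp pooling can be sketched as below. This is a generic log-mean-exp with a learnable sharpness parameter that interpolates between mean pooling (sharpness near 0) and max pooling (large sharpness); the class name `LogMeanExpPool` and the parameter `init_beta` are hypothetical, and this sketch is not necessarily the paper's AlphaMEX formula.

```python
import math

import torch
import torch.nn as nn


class LogMeanExpPool(nn.Module):
    """Global pooling via log-mean-exp with a learnable sharpness.

    NOTE: an illustrative sketch, not the paper's exact AlphaMEX operator.
    Computes (1/beta) * log((1/N) * sum_i exp(beta * x_i)) over each
    channel's spatial positions, so beta is trained with the network.
    """

    def __init__(self, init_beta: float = 1.0):
        super().__init__()
        # Learnable sharpness, updated by backprop like any other weight.
        self.beta = nn.Parameter(torch.tensor(init_beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, H, W) feature maps from the last conv block.
        b, c, h, w = x.shape
        flat = x.reshape(b, c, h * w)
        n = h * w
        # logsumexp keeps the computation numerically stable;
        # subtracting log(n) turns the sum into a mean.
        return (torch.logsumexp(self.beta * flat, dim=-1) - math.log(n)) / self.beta
```

As a usage sketch, such a module would replace the global average pooling layer before the classifier, e.g. `out = LogMeanExpPool(init_beta=2.0)(features)` maps a (batch, channels, H, W) tensor to (batch, channels).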
Year
2018
DOI
10.1016/j.neucom.2018.07.079
Venue
Neurocomputing
Keywords
CNN, Global Pooling, Feature-map sparsity, AlphaMEX, Network compression
Field
Normalization (statistics), Nonlinear system, Pattern recognition, Convolutional neural network, Pooling, Word error rate, Operator (computer programming), Artificial intelligence, Contextual image classification, Machine learning, Mathematics
DocType
Journal
Volume
321
ISSN
0925-2312
Citations
3
PageRank
0.39
References
43
Authors
4
Name          Order  Citations  PageRank
Boxue Zhang   1      6          2.28
Qi Zhao       2      20         9.69
Wenquan Feng  3      9          3.58
Shuchang Lyu  4      5          2.14