Title
CondConv: Conditionally Parameterized Convolutions for Efficient Inference
Abstract
Convolutional layers are one of the basic building blocks of modern deep neural networks. One fundamental assumption is that convolutional kernels should be shared for all examples in a dataset. We propose conditionally parameterized convolutions (CondConv), which learn specialized convolutional kernels for each example. Replacing normal convolutions with CondConv enables us to increase the size and capacity of a network, while maintaining efficient inference. We demonstrate that scaling networks with CondConv improves the performance and inference cost trade-off of several existing convolutional neural network architectures on both classification and detection tasks. On ImageNet classification, our CondConv approach applied to EfficientNet-B0 achieves state-of-the-art performance of 78.3% accuracy with only 413M multiply-adds. Code and checkpoints for the CondConv TensorFlow layer and CondConv-EfficientNet models are available at: https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv.
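The abstract describes the mechanism only at a high level: each example's kernel is a routing-weighted mixture of expert kernels, with routing weights produced by global average pooling, a learned linear map, and a sigmoid. Below is a minimal TensorFlow sketch of that idea, not the authors' released implementation (which is linked above); the class name CondConv2D, the num_experts parameter, and the per-example map_fn convolution loop are illustrative choices for clarity.

```python
import tensorflow as tf


class CondConv2D(tf.keras.layers.Layer):
    """Minimal sketch of a conditionally parameterized convolution.

    Each example's kernel is a routing-weighted sum of `num_experts`
    expert kernels; routing weights come from global average pooling
    followed by a learned linear map and a sigmoid.
    """

    def __init__(self, filters, kernel_size, num_experts=4, stride=1, **kwargs):
        super().__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size
        self.num_experts = num_experts
        self.stride = stride

    def build(self, input_shape):
        in_ch = int(input_shape[-1])
        k = self.kernel_size
        # One kernel per expert: (experts, k, k, in_ch, out_ch).
        self.expert_kernels = self.add_weight(
            name="expert_kernels",
            shape=(self.num_experts, k, k, in_ch, self.filters),
            initializer="glorot_uniform")
        # Routing function applied to globally pooled features.
        self.routing_kernel = self.add_weight(
            name="routing_kernel",
            shape=(in_ch, self.num_experts),
            initializer="glorot_uniform")

    def call(self, inputs):
        # Per-example routing weights in (0, 1): shape (batch, experts).
        pooled = tf.reduce_mean(inputs, axis=[1, 2])
        routing = tf.sigmoid(tf.matmul(pooled, self.routing_kernel))
        # Per-example kernels: weighted sums of the expert kernels.
        kernels = tf.einsum("be,ekhio->bkhio", routing, self.expert_kernels)

        def conv_one(args):
            x, kernel = args  # x: (H, W, C); kernel: (k, k, C, filters)
            y = tf.nn.conv2d(x[tf.newaxis], kernel,
                             strides=self.stride, padding="SAME")
            return y[0]

        # Convolve each example with its own kernel. A batched
        # implementation can instead fold examples into the groups of a
        # grouped convolution; the loop keeps this sketch readable.
        return tf.map_fn(conv_one, (inputs, kernels),
                         fn_output_signature=tf.float32)
```

As a quick check, `CondConv2D(filters=32, kernel_size=3)(tf.random.normal([8, 32, 32, 16]))` returns an `(8, 32, 32, 32)` tensor. Sigmoid routing (rather than softmax) lets each example activate several experts at once, so capacity grows with the number of experts while inference cost stays close to that of a single convolution.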
Year: 2019
Venue: Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Field: Parameterized complexity, Mathematical optimization, Inference, Convolution, Computer science, Theoretical computer science
DocType:
Volume: 32
ISSN: 1049-5258
Conference:
Citations: 1
PageRank: 0.35
References: 0
Authors: 4

Name           | Order | Citations | PageRank
Brandon Yang   | 1     | 1         | 0.69
Gabriel Bender | 2     | 157       | 8.35
Quoc V. Le     | 3     | 8501      | 366.59
Jiquan Ngiam   | 4     | 2971      | 9.56