Title
Look-Up Table Unit Activation Function For Deep Convolutional Neural Networks
Abstract
Activation functions provide deep neural networks with the non-linearity necessary to learn complex distributions. It remains inconclusive what the optimal shape of an activation function is. In this work, we introduce a novel type of activation function whose shape is learned during network training. The proposed Look-up Table Unit (LuTU) stores a set of anchor points in a look-up-table-like structure, and the activation function is generated from the anchor points either by linear interpolation or by smoothing with a single-period cosine mask function. In theory, LuTU can approximate any univariate function. By observing the learned shapes of LuTU, we further propose a Mixture of Gaussian Unit (MoGU) that can learn similar non-linear shapes with far fewer parameters. Finally, we use a multiple-activation-function fusion framework that combines several types of functions to achieve better performance; with linear interpolation approximation, the inference complexity of this fusion is constant. Our experiments on a synthetic dataset, ImageNet, and CIFAR-10 demonstrate that the proposed method outperforms the traditional ReLU family of activation functions. On the ImageNet dataset, our method achieves 1.47% and 1.0% higher accuracy on ResNet-18 and ResNet-34 models, respectively. With the proposed activation function, we can design a network that matches the performance of ResNet-34 with eight fewer convolutional layers.
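The abstract describes the mechanism only at a high level. A minimal NumPy sketch of the two ideas is given below, under stated assumptions: the anchor count, the anchor span, and the ReLU initialization are illustrative choices not taken from the paper, and the MoGU form shown (a weighted sum of Gaussian bumps) is a plausible reading of "mixture of Gaussian", not necessarily the authors' exact parameterization.

```python
import numpy as np

class LookupTableUnit:
    """Piecewise-linear activation defined by a table of anchor points.

    Sketch of the linear-interpolation variant of LuTU: anchor heights
    (anchor_y) would be the trainable parameters during network training.
    """
    def __init__(self, n_anchors=21, span=5.0):
        # Fixed, evenly spaced anchor locations on [-span, span]
        # (an assumed layout; the paper only specifies a set of anchors).
        self.anchor_x = np.linspace(-span, span, n_anchors)
        # Initialize anchor heights to ReLU as a reasonable starting shape.
        self.anchor_y = np.maximum(self.anchor_x, 0.0)

    def __call__(self, x):
        # np.interp linearly interpolates between anchors and clamps to
        # the end anchors outside the table's range.
        return np.interp(x, self.anchor_x, self.anchor_y)

def mogu(x, mu, sigma, w):
    """Assumed MoGU form: a weighted sum of Gaussian bumps.

    mu, sigma, w are the (few) trainable parameters, one per component.
    """
    bumps = np.exp(-((x[..., None] - mu) ** 2) / (2.0 * sigma ** 2))
    return (w * bumps).sum(axis=-1)

act = LookupTableUnit()
print(act(np.array([-2.0, -0.5, 0.3, 1.7])))
print(mogu(np.array([-2.0, 0.3, 1.7]),
           mu=np.array([-1.0, 1.0]),
           sigma=np.array([0.5, 0.8]),
           w=np.array([0.3, 1.2])))
```

Because the interpolated table is itself piecewise linear, a fused combination of several activation functions can be re-sampled into a single table, which is one way to see why the abstract's fusion scheme has constant inference cost.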
Year
2018
DOI
10.1109/WACV.2018.00139
Venue
2018 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2018)
Field
Kernel (linear algebra),Lookup table,Pattern recognition,Convolutional neural network,Computer science,Activation function,Interpolation,Smoothing,Artificial intelligence,Linear interpolation,Artificial neural network
DocType
Conference
ISSN
2472-6737
Citations
0
PageRank
0.34
References
0
Authors
3
Name            Order  Citations  PageRank
Min Wang        1      169        36.41
Baoyuan Liu     2      13         25.64
Hassan Foroosh  3      748        59.98