Title
Cosine Normalization: Using Cosine Similarity Instead of Dot Product in Neural Networks
Abstract
Traditionally, multi-layer neural networks use the dot product between the output vector of the previous layer and the incoming weight vector as the input to the activation function. The result of the dot product is unbounded, which increases the risk of large variance. Large variance of a neuron makes the model sensitive to changes in the distribution of its input, which results in poor generalization and aggravates the internal covariate shift that slows down training. To bound the dot product and decrease the variance, we propose using cosine similarity or centered cosine similarity (the Pearson correlation coefficient) instead of the dot product in neural networks, a technique we call cosine normalization. We compare cosine normalization with batch, weight, and layer normalization in fully-connected and convolutional neural networks on the MNIST, 20NEWSGROUP, CIFAR-10/100, and SVHN datasets. Experiments show that cosine normalization achieves better performance than the other normalization techniques.
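The idea described in the abstract can be illustrated with a short sketch. The NumPy code below (an illustrative sketch, not the authors' implementation; the function names and the eps stabilizer are assumptions) contrasts the conventional unbounded dot-product pre-activation with cosine normalization and its centered variant:

```python
import numpy as np

def dot_preactivation(x, w, b=0.0):
    # Conventional pre-activation: an unbounded dot product.
    return np.dot(w, x) + b

def cosine_preactivation(x, w, eps=1e-8):
    # Cosine normalization: divide by the norms of the input
    # and weight vectors, bounding the result to [-1, 1].
    return np.dot(w, x) / (np.linalg.norm(w) * np.linalg.norm(x) + eps)

def centered_cosine_preactivation(x, w, eps=1e-8):
    # Centered variant (Pearson correlation coefficient):
    # subtract each vector's mean before normalizing.
    xc, wc = x - x.mean(), w - w.mean()
    return np.dot(wc, xc) / (np.linalg.norm(wc) * np.linalg.norm(xc) + eps)

# Example: a single neuron with a 4-dimensional input.
rng = np.random.default_rng(0)
x, w = rng.normal(size=4), rng.normal(size=4)
print(dot_preactivation(x, w))              # unbounded value
print(cosine_preactivation(x, w))           # value in [-1, 1]
print(centered_cosine_preactivation(x, w))  # value in [-1, 1]
```

Because the cosine-normalized pre-activation is bounded, its variance cannot grow without limit, which is the motivation the abstract gives for the technique.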
Year
2018
Venue
ICANN
DocType
Conference
Volume
abs/1702.05870
Citations
4
PageRank
0.43
References
17
Authors
4
Name           Order  Citations  PageRank
Chunjie Luo    1      434        21.86
Jianfeng Zhan  2      767        62.86
Lei Wang       3      577        46.85
Qiang Yang     4      17039      875.69