Title: Negative Log Likelihood Ratio Loss for Deep Neural Network Classification
Abstract
In deep neural networks, the cross-entropy loss function is commonly used for classification. Minimizing cross-entropy is equivalent to maximizing likelihood under the assumptions of uniform feature and class distributions. It is a generative training criterion that does not directly discriminate the correct class from competing classes. We propose a discriminative loss function based on the negative log likelihood ratio between the correct class and the competing classes. It significantly outperforms the cross-entropy loss on the CIFAR-10 image classification task.
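The abstract contrasts cross-entropy, which penalizes only -log p(correct class), with a negative log likelihood ratio between the correct and competing classes. A minimal NumPy sketch of that idea, assuming the simplest reading in which the competing mass is the summed probability of all other classes (the paper's exact formulation may differ):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, labels):
    # Standard CE: -log p(correct class).
    p = softmax(logits)
    return -np.log(p[np.arange(len(labels)), labels])

def nllr_loss(logits, labels):
    # Negative log likelihood RATIO: -log( p_correct / sum of competing p ).
    # Here the competing likelihood is taken as 1 - p_correct, i.e. the
    # total probability assigned to all incorrect classes (an assumption).
    p = softmax(logits)
    p_correct = p[np.arange(len(labels)), labels]
    p_competing = 1.0 - p_correct
    return -np.log(p_correct / p_competing)

logits = np.array([[2.0, 0.5, -1.0]])
labels = np.array([0])
```

Because the ratio p/(1-p) exceeds p for any p in (0, 1), this loss is always below cross-entropy on the same example and, unlike cross-entropy, goes negative once the correct class holds a majority of the probability mass, directly rewarding the margin over the competing classes.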
Year: 2018
Venue: arXiv: Learning
Field: Neural network classification, Artificial intelligence, Generative grammar, Contextual image classification, Artificial neural network, Discriminative model, Mathematics, Negative log likelihood, Machine learning
DocType: Journal
Volume: abs/1804.10690
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name | Order | Citations | PageRank
Donglai Zhu | 1 | 0 | 0.34
Heng-shuai Yao | 2 | 26 | 10.03
Bei Jiang | 3 | 7 | 2.84
Peng Yu | 4 | 27 | 20.15