Title
Deep Neural Nets with Interpolating Function as Output Activation.
Abstract
We replace the output layer of deep neural nets, typically the softmax function, with a novel interpolating function, and we propose end-to-end training and testing algorithms for this new architecture. Compared with classical neural nets that use the softmax function as the output activation, the variant with an interpolating function as the output activation combines the advantages of both deep learning and manifold learning. The new framework offers two major advantages: first, it is better suited to settings with insufficient training data; second, it significantly improves generalization accuracy across a wide variety of networks. The algorithm is implemented in PyTorch, and the code is available at https://github.com/BaoWangMath/DNN-DataDependentActivation.
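A minimal sketch of the idea in PyTorch, assuming a simplified kernel-weighted k-nearest-neighbor interpolation as the output activation; the names (FeatureNet, interpolating_output) and the specific interpolation rule are illustrative stand-ins, not the paper's exact interpolating function or training procedure. A feature extractor produces deep features, and class scores for a query are interpolated from the labels of a labeled template set instead of being produced by a softmax layer.

# Hypothetical sketch: softmax output replaced by label interpolation over deep features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureNet(nn.Module):
    """Deep feature extractor; everything before the output activation."""
    def __init__(self, in_dim=784, feat_dim=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.body(x)

def interpolating_output(query_feats, template_feats, template_labels,
                         num_classes, k=15, sigma=1.0):
    """Interpolating output activation (simplified stand-in).

    Each query point receives class scores that are a kernel-weighted average
    of the one-hot labels of its k nearest labeled template features.
    """
    dists = torch.cdist(query_feats, template_feats)                   # (B, N)
    knn_d, knn_i = dists.topk(k, largest=False)                        # (B, k)
    weights = torch.softmax(-knn_d / sigma, dim=1)                     # kernel weights
    one_hot = F.one_hot(template_labels[knn_i], num_classes).float()   # (B, k, C)
    return (weights.unsqueeze(-1) * one_hot).sum(dim=1)                # (B, C) scores

# Toy usage: random data standing in for a labeled template set and a test batch.
if __name__ == "__main__":
    torch.manual_seed(0)
    net = FeatureNet()
    x_tmpl, y_tmpl = torch.randn(500, 784), torch.randint(0, 10, (500,))
    x_test = torch.randn(8, 784)
    with torch.no_grad():
        scores = interpolating_output(net(x_test), net(x_tmpl), y_tmpl, num_classes=10)
    print(scores.argmax(dim=1))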
Year
2018
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018)
Keywords
constant factors, the framework, subjective logic, manifold learning, softmax function
Field
Training set, Softmax function, Computer science, Interpolation, Artificial intelligence, Nonlinear dimensionality reduction, Artificial neural network, Machine learning
DocType
Conference
Volume
31
ISSN
1049-5258
Citations
3
PageRank
0.39
References
0
Authors
6
Name            Order   Citations   PageRank
Bao Wang        1       35          8.19
Xiyang Luo      2       17          5.09
Zhen Li         3       397         90.65
Wei Zhu         4       63          10.82
Zuoqiang Shi    5       121         18.35
Stanley Osher   6       7973        514.62