Title
Deep Learning with Data Dependent Implicit Activation Function
Abstract
Although deep neural networks (DNNs) achieve remarkable performance on many artificial intelligence tasks, the scarcity of training instances remains a notorious challenge: as networks go deeper, generalization accuracy decays rapidly when massive amounts of training data are unavailable. In this paper, we propose novel deep neural network structures that can be built from any existing DNN at almost the same level of complexity, and we develop simple training algorithms for them. We show that our paradigm successfully resolves the lack-of-data issue. Tests on the CIFAR10 and CIFAR100 image recognition datasets show that the new paradigm yields a 20% to 30% relative error-rate reduction compared to the base DNNs. The intuition behind our algorithm for deep residual networks stems from the theory of partial differential equation (PDE) control problems. Code will be made available.
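This record does not spell out the paper's construction, but the title suggests an output activation whose value depends on the training data itself rather than a fixed pointwise nonlinearity. As a hedged, hypothetical illustration of what such a data-dependent, implicitly defined activation could look like, the NumPy sketch below produces class scores for new points by interpolating training labels over a feature graph (harmonic extension, solved as a linear system). The function names, the Gaussian affinity kernel, and the interpolation scheme are all assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch of a data-dependent, implicitly defined output
# activation: class scores for new points are obtained by interpolating
# training labels over a feature graph, i.e. by solving
#     (D_uu - W_uu) F_u = W_ul Y_l
# rather than by applying a fixed pointwise nonlinearity.
# This is an assumed illustration, not the construction from the paper.
import numpy as np

def rbf_affinity(A, B, sigma=1.0):
    # Gaussian affinities between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def interpolating_activation(feat_new, feat_train, labels_onehot, sigma=1.0):
    # Each prediction is an implicit, data-dependent average of the
    # training labels: solve the graph-Laplacian interpolation system.
    W_uu = rbf_affinity(feat_new, feat_new, sigma)     # new-to-new
    W_ul = rbf_affinity(feat_new, feat_train, sigma)   # new-to-train
    D = np.diag(W_uu.sum(axis=1) + W_ul.sum(axis=1))   # degree matrix
    return np.linalg.solve(D - W_uu, W_ul @ labels_onehot)

# Toy usage: two Gaussian blobs in 2-D, one per class.
rng = np.random.default_rng(0)
feat_train = np.vstack([rng.normal(0.0, 0.3, (10, 2)),
                        rng.normal(2.0, 0.3, (10, 2))])
labels_onehot = np.repeat(np.eye(2), 10, axis=0)
feat_new = np.array([[0.1, 0.0], [1.9, 2.1]])
print(interpolating_activation(feat_new, feat_train, labels_onehot).argmax(1))
# expected: [0 1]
```

In a full network, feat_train and feat_new would be penultimate-layer features; the activation is "implicit" in the sense that the output is defined by an equation to be solved, not by a closed-form pointwise map.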
Year
2018
Venue
arXiv: Learning
Field
Residual, Activation function, Data dependent, Intuition, Artificial intelligence, Deep learning, Artificial neural network, Partial differential equation, Machine learning, Approximation error, Mathematics
DocType
Journal
Volume
abs/1802.00168
Citations
1
PageRank
0.35
References
15
Authors
6
Name           Order  Citations  PageRank
Bao Wang       1      11         2.55
Xiyang Luo     2      17         5.09
Zhen Li        3      33         12.70
Wei Zhu        4      48         12.13
Zuoqiang Shi   5      121        18.35
Stanley Osher  6      7973       514.62