Title
Enhancing Batch Normalized Convolutional Networks using Displaced Rectifier Linear Units: A Systematic Comparative Study
Highlights
• Enhanced nonlinearities may improve the performance of expert systems.
• Proposal of the activation function DReLU.
• DReLU presents the best training speed in all cases.
• DReLU enhances ReLU performance in all scenarios.
• DReLU provides the best test accuracy in almost all experiments.
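The record does not reproduce the definition of DReLU itself. As a minimal sketch, assuming (from the paper's title, "Displaced Rectifier Linear Units") that DReLU diagonally displaces the ReLU rectification point into the third quadrant by a small constant delta, i.e. f(x) = max(x, -delta), with the default delta below being a hypothetical value rather than one taken from this record:

import numpy as np

def drelu(x, delta=0.05):
    """Displaced Rectifier Linear Unit (DReLU), assumed form.

    Identity for inputs above -delta; inputs below are clamped to -delta,
    shifting the ReLU "kink" from the origin into the third quadrant.
    delta=0.05 is an assumed hyperparameter, not sourced from this record.
    """
    return np.maximum(x, -delta)

# Example: negative pre-activations are clipped at -delta, not at 0 as in ReLU.
x = np.array([-1.0, -0.03, 0.0, 2.0])
print(drelu(x))  # [-0.05 -0.03  0.    2.  ]

Under this assumed form, DReLU reduces to plain ReLU when delta = 0; the nonzero displacement is what allows small negative activations to pass through unchanged.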
Year
2019
DOI
10.1016/j.eswa.2019.01.066
Venue
Expert Systems with Applications
Keywords
DReLU, Activation function, Batch normalization, Comparative study, Convolutional Neural Networks, Deep learning
Field
Identity function, Residual, Rectifier, Normalization (statistics), Convolutional neural network, Activation function, Computer science, Artificial intelligence, Deep learning, Machine learning, Statistical hypothesis testing
DocType
Journal
Volume
124
ISSN
0957-4174
Citations
1
PageRank
0.35
References
0
Authors
4
Name                      Order  Citations  PageRank
David Macêdo              1      7          3.16
Cleber Zanchettin         2      127        21.14
Adriano L. I. Oliveira    3      364        36.36
Teresa Bernarda Ludermir  4      928        108.14