Title
Exploring Optimal Adaptive Activation Functions for Various Tasks
Abstract
An activation function is a key component of artificial neural networks (ANNs) and has a great impact on both the performance and the convergence of a network. In this work, a self-adapting methodology is proposed to explore optimal adaptive activation functions for various tasks, based on S-shaped or ReLU-shaped activation functions that are regulated by introducing only a few parameters. To verify the effectiveness of the proposed methodology, a series of comparison experiments is performed with MLP, CNN, and RNN network structures on benchmark image, text, and audio datasets. The experimental results are encouraging and show that the proposed methodology can locate optimal activation functions for various tasks. Moreover, the obtained functions are competitive, and the improvements in network performance over other popular activation functions, such as ELU, PReLU, ReLU, and Sigmoid, are significant.
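The abstract does not spell out the parameterization, so what follows is only a minimal PyTorch sketch of the general idea: an activation function whose shape is regulated by a few trainable parameters. The module name AdaptiveActivation and the form f(x) = a * base(b * x) are illustrative assumptions, not the paper's actual formulation; with base = sigmoid the function is S-shaped, and with base = ReLU it is ReLU-shaped.

    # Minimal sketch of a trainable-shape activation (assumed form, not the
    # paper's actual parameterization): f(x) = a * base(b * x).
    import torch
    import torch.nn as nn

    class AdaptiveActivation(nn.Module):
        """S-shaped (base=torch.sigmoid) or ReLU-shaped (base=torch.relu)
        activation whose scale a and slope b are learned with the weights."""
        def __init__(self, base=torch.sigmoid, a_init=1.0, b_init=1.0):
            super().__init__()
            self.base = base
            self.a = nn.Parameter(torch.tensor(a_init))  # output scale
            self.b = nn.Parameter(torch.tensor(b_init))  # input slope

        def forward(self, x):
            return self.a * self.base(self.b * x)

    # Usage: drop in wherever a fixed activation would go; a and b are
    # updated by backpropagation along with the other network parameters.
    layer = nn.Sequential(nn.Linear(16, 32), AdaptiveActivation(torch.relu))
    y = layer(torch.randn(4, 16))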
Year
2020
DOI
10.1109/BIBM49941.2020.9313386
Venue
BIBM
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
6
Name          Order  Citations  PageRank
Aizhu Liu     1      0          0.34
Haigen Hu     2      0          0.34
Tian Qiu      3      0          0.34
Qianwei Zhou  4      0          0.34
Qiu Guan      5      8          3.49
Xiao-Xin Li   6      8          2.80