Title
A novel minimum-size activation function and its derivative
Abstract
This brief presents two novel circuit architectures: a nonlinear neural activation function and its derivative. Both are suitable for implementing neurons in multilayer perceptron networks with an on-chip backpropagation learning algorithm. The proposed activation function occupies minimal area, consumes minimal power, can be regarded as an approximation to the tanh(nx) function, and can be programmed to achieve any slope at the origin equal to or greater than 2. The proposed derivative also occupies minimal area and maximizes its similarity with the ideal derivative in the proximity of the origin, being the best approximation for its degree of complexity. Both topologies are designed with subthreshold metal-oxide-semiconductor transistors in order to minimize power consumption. Likewise, they are designed with balanced, fully differential topologies, so that external influences, offset, and even-order distortion are reduced. Moreover, a detailed analysis using the General Translinear Principle shows that the activation function is affected by the body effect, whereas the derivative function is immune to it. The proposed activation function and derivative are thoroughly analyzed, and measured results are presented for an implementation in 0.5-µm AMI Semiconductor (AMIS) CMOS technology.
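The paper's circuit is analog, but the ideal curves it approximates follow directly from the abstract: an activation f(x) = tanh(nx), whose slope at the origin is n (hence the programmable slope of 2 or more), and its analytic derivative. A minimal numerical reference in Python, purely illustrative (the function names and parameterization are not from the paper):

```python
import math

def activation(x, n=2.0):
    """Ideal nonlinearity the circuit approximates: f(x) = tanh(n*x).
    Its slope at the origin is n, matching the paper's programmable
    slope >= 2."""
    return math.tanh(n * x)

def activation_derivative(x, n=2.0):
    """Analytic derivative f'(x) = n * (1 - tanh(n*x)**2), the ideal
    curve the proposed derivative circuit is compared against near
    the origin."""
    t = math.tanh(n * x)
    return n * (1.0 - t * t)
```

Evaluating `activation_derivative(0.0, n=2.0)` returns 2.0, i.e., the origin slope equals the programmed n.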
Year
2009
DOI
10.1109/TCSII.2009.2015398
Venue
IEEE Transactions on Circuits and Systems II: Express Briefs
Keywords
minimum-size activation function, nonlinear neural activation function, activation function, derivative function, derivative, body effect, minimal area, best approximation, power consumption, CMOS technology, General Translinear Principle, nonlinear distortion, topology, backpropagation, learning (artificial intelligence), low voltage, network on chip, subthreshold, multilayer perceptron, ambient intelligence
Field
Nonlinear system, Control theory, Activation function, Derivative, Electronic engineering, Multilayer perceptron, Hyperbolic function, Backpropagation, Nonlinear distortion, Distortion, Mathematics
DocType
Journal
Volume
56
Issue
4
ISSN
1549-7747
Citations
4
PageRank
0.51
References
3
Authors
2
Name                     Order  Citations  PageRank
Manuel Carrasco-Robles   1      15         3.25
Luis Serrano             2      54         7.43