Title
Training neural networks with threshold activation functions and constrained integer weights
Abstract
Evolutionary neural network training algorithms are presented. These algorithms are applied to train neural networks whose weight values are confined to a narrow band of integers. We constrain the weights and biases to the range [-2^(k-1)+1, 2^(k-1)-1], for k = 3, 4, 5, so that they can be represented by just k bits. Such neural networks are better suited for hardware implementation than those with real-valued weights. Mathematical operations that are easy to implement in software can be very burdensome in hardware and therefore more costly. Hardware-friendly algorithms are essential to ensure the functionality and cost-effectiveness of a hardware implementation. To this end, in addition to using integer weights, the trained neural networks use only threshold activation functions, making hardware implementation even easier. These algorithms were designed with the consideration that the resulting integer weights require fewer bits to store and that digital arithmetic operations between them are easier to implement in hardware. Clearly, if the network is trained in a constrained weight space, smaller weights are found and less memory is required. On the other hand, as we have found here, the network training procedure can be more effective and efficient when larger weights are allowed. Thus, for a given application, a trade-off between effectiveness and memory consumption has to be considered. Our intention is to present results of evolutionary algorithms on this difficult task. Based on the application of the proposed class of methods to classical neural network benchmarks, our experience is that these methods are effective and reliable.
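The weight constraint and threshold activation described in the abstract can be illustrated with a minimal sketch. The function and variable names below are hypothetical (the paper's actual evolutionary training algorithm is not reproduced here); the sketch only shows the k-bit integer bounds [-2^(k-1)+1, 2^(k-1)-1] and a forward pass through a layer with a hard threshold activation.

```python
import numpy as np

def weight_bounds(k):
    # k-bit signed integer weights are constrained to [-2**(k-1)+1, 2**(k-1)-1],
    # e.g. k=3 gives [-3, 3], k=4 gives [-7, 7], k=5 gives [-15, 15]
    return -(2 ** (k - 1)) + 1, 2 ** (k - 1) - 1

def clamp_weights(w, k):
    # round real-valued weights to the nearest integer and clip to the k-bit band
    lo, hi = weight_bounds(k)
    return np.clip(np.rint(w), lo, hi).astype(int)

def threshold(x):
    # hard threshold activation: 1 where the net input is non-negative, else 0
    return (x >= 0).astype(int)

def forward(x, W, b):
    # forward pass of one layer: integer weights, integer biases, threshold units
    return threshold(x @ W + b)

# example: a single threshold unit computing logical AND with 3-bit weights
W = np.array([[1], [1]])
b = np.array([-2])
out = forward(np.array([[1, 1], [1, 0], [0, 0]]), W, b)
```

With these weights the unit fires only when both inputs are active, showing that useful Boolean functions are representable even in the narrowest (k = 3) weight band.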
Year
2000
DOI
10.1109/IJCNN.2000.861451
Venue
IJCNN (5)
Keywords
neural network, trained neural network, network training procedure, train neural networks, real weight, evolutionary computation, learning (artificial intelligence), classical neural network benchmarks, hardware implementation, integer weight, constrained integer weights, larger weight, threshold activation functions, smaller weight, training neural networks, evolutionary neural network training, neural nets, learning artificial intelligence, training data, artificial neural networks, neural networks, artificial intelligence, feedforward neural networks, mathematics, activation function
Field
Feedforward neural network, Random neural network, Computer science, Stochastic neural network, Recurrent neural network, Time delay neural network, Artificial intelligence, Deep learning, Winner-take-all, Machine learning, Catastrophic interference
DocType
Conference
Volume
5
ISSN
1098-7576
ISBN
0-7695-0619-4
Citations
17
PageRank
2.36
References
2
Authors
2
Name: Plagianakos, V.P. — Order: 1, Citations: 173, PageRank: 13.01
Name: Michael N. Vrahatis — Order: 2, Citations: 179, PageRank: 13.96