Title
Effects of weight discretization on the back propagation learning method: algorithm design and hardware realization
Abstract
An architectural configuration for the back-propagation (BP) algorithm is illustrated. The circuit solutions for its basic blocks are presented, and the effect of weight discretization on the BP algorithm is analyzed. Simulations demonstrate that the BP algorithm can operate successfully with discretized weights. In particular, better performance is achieved with an exponential discretization, i.e. when the strength of a weight varies exponentially with its controlling variable (voltage). The discretized voltage values differ by a quantity large enough that the neural network can be backed up with a refresh technique combined with a multilevel dynamic memory, which entails a particularly low wiring cost. A quasi-analog adaptive architecture is devised that properly matches the BP algorithm, and its CMOS circuit implementation is detailed. The mechanism controlling weight changes is simple enough to be reproduced locally at each synapse site, thus meeting one of the requirements for an efficient storage technology for analog VLSI.
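The exponential weight discretization the abstract refers to can be illustrated with a minimal sketch: weights are restricted to a set of levels whose magnitudes are exponentially spaced, and after each BP update the weight is snapped to the nearest allowed level. This is only an illustration of the general technique, not the authors' circuit; the level count, magnitude range, and nearest-level snapping rule are assumptions.

```python
import numpy as np

def exponential_levels(n_levels=16, w_min=0.01, w_max=1.0):
    """Allowed weight values: exponentially spaced magnitudes (both signs),
    plus zero. Mirrors the idea that weight strength varies exponentially
    with the controlling voltage. n_levels, w_min, w_max are assumed here."""
    mags = w_min * (w_max / w_min) ** (np.arange(n_levels) / (n_levels - 1))
    return np.concatenate([-mags[::-1], [0.0], mags])

def discretize(w, levels):
    """Snap each weight to the nearest allowed level (assumed rounding rule)."""
    w_arr = np.asarray(w, dtype=float)
    idx = np.abs(levels[None, :] - w_arr.reshape(-1, 1)).argmin(axis=1)
    return levels[idx].reshape(w_arr.shape)
```

A discretized BP step would then compute the usual gradient update on a continuous shadow value and store `discretize(w - lr * grad, levels)` as the weight actually used in the forward pass.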
Year
1990
DOI
10.1109/IJCNN.1990.137958
Venue
IJCNN
Keywords
cmos integrated circuits, vlsi, learning systems, neural nets, parallel architectures, cmos circuit implementation, algorithm design, architectural configuration, hardware realization, multilevel dynamic memory, neural network, propagation learning method, refresh technique, weight discretization, back propagation
Field
Adaptive architecture, Discretization, Algorithm design, Computer science, CMOS, Artificial intelligence, Backpropagation, Artificial neural network, Integrated circuit, Very-large-scale integration, Machine learning
DocType
Conference
Citations
5
PageRank
0.55
References
1
Authors
3
Name | Order | Citations | PageRank
Caviglia, D.D. | 1 | 22 | 4.64
M. Valle | 2 | 97 | 19.19
Bisio, G.M. | 3 | 22 | 4.24