Title: Neural network transformation under hardware constraints
Abstract
There are many mature methods for training various kinds of artificial neural networks (ANNs), including backpropagation (BP) based algorithms. Such training is usually carried out on GPU-enabled machines, with 16- or 32-bit floating-point numbers as the network parameters and no limitation on the maximum fan-in/fan-out of a single neuron or on the type of activation function. In contrast, neuromorphic chips [1][2][3] impose quite a few hardware-specific constraints: the fan-in/fan-out of a single neuron is limited, the range of synaptic weights is restricted, and the hardware neuron models or activation functions are usually simpler than their software counterparts. These constraints make programming such chips difficult.
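To illustrate the kind of transformation the abstract alludes to, the sketch below clips trained floating-point weights to a limited synaptic-weight range and quantizes them to a fixed bit width. This is a minimal, hypothetical example, not the paper's actual method; the bit width, weight range, and function name are all assumptions.

```python
import numpy as np

def quantize_weights(w, n_bits=8, w_max=1.0):
    """Clip trained float weights to [-w_max, w_max] and map them to
    signed integers representable with n_bits (assumed hardware
    constraint; real neuromorphic chips differ in range and precision)."""
    levels = 2 ** (n_bits - 1) - 1           # e.g. 127 for 8-bit weights
    clipped = np.clip(w, -w_max, w_max)      # enforce the limited weight range
    q = np.round(clipped / w_max * levels).astype(np.int32)
    return q, w_max / levels                 # integer weights plus scale factor

# Example: float32 software weights -> constrained 8-bit hardware weights
w = np.array([0.5, -1.3, 0.02], dtype=np.float32)
q, scale = quantize_weights(w)
```

Here the out-of-range weight -1.3 saturates at the hardware limit, while in-range weights are rounded to the nearest representable level; the returned scale factor would let a runtime recover approximate real-valued weights.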
Year: 2016
DOI: 10.1145/2968455.2981122
Venue: CASES
Keywords: hardware constraints, ANN, artificial neural network transformation, BP based algorithm, backpropagation based algorithm, floating point numbers, activation functions, neuromorphic chips, single neuron fan-in/fan-out, synaptic weights, GPU-enabled machines
Field: Computer science, Floating point, Parallel computing, Neuromorphic engineering, Code generation, Error detection and correction, Software, Computer hardware, Artificial neural network, Winner-take-all, Backpropagation
DocType: Conference
ISBN: 978-1-5090-3589-2
Citations: 1
PageRank: 0.34
References: 3
Authors: 4
Name            Order  Citations  PageRank
Youhui Zhang    1      202        28.36
Yu Ji           2      16         2.24
Wenguang Chen   3      1014       70.57
Yuan Xie        4      6430       407.00