Title
Stable and compact design of Memristive GoogLeNet Neural Network
Abstract
According to the requirements of edge intelligence on circuit size, power consumption, and computing performance, a Memristive GoogLeNet Neural Network (MGNN) circuit is designed using the memristor, a new device that integrates storage and computing, as the basic circuit element. The circuit adopts 1×1 convolution and multi-scale convolution feature fusion to reduce the number of layers required by the network while preserving recognition accuracy. To reduce the size of the memristor crossbars in the circuit, we design word-line pruning and bit-line pruning methods for the Memristive Convolution (MC) layers, and we further exploit the parameter distribution of the memristive neural network to shrink the crossbars. The Memristive Batch Normalization (MBN) layer and Memristive Dropout (MD) layer are merged into the preceding MC layers according to a mathematical analysis, which cuts the number of network layers and decreases the power consumption of the circuit. We also design channel-optimization and layer-optimization methods for the MC layers, which greatly reduce the negative effect of the memristors' multi-state conductance on accuracy, improve the stability of the circuit, and reduce the circuit size and power consumption. Experiments show that the circuit achieves 89.83% accuracy on the CIFAR-10 dataset, and the power consumption of a single neuron is only 1.3 μW. Even when the number of memristor conductance states is limited to 2^4 = 16, the MGNN circuit still attains accuracy close to that of the floating-point MGNN.
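As an illustration of the layer-merging step described in the abstract, the following minimal sketch (written with NumPy; the function name and argument layout are our own assumptions, not the authors' implementation) shows the standard identity for folding a batch-normalization layer into the weights and bias of the preceding convolution layer:

import numpy as np

def fold_bn_into_conv(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold per-channel BatchNorm parameters into the preceding convolution.

    W:     conv weights, shape (out_channels, in_channels, kH, kW)
    b:     conv bias, shape (out_channels,) (zeros if the conv has no bias)
    gamma, beta, mean, var: BN parameters, each shape (out_channels,)

    BN(conv(x)) = gamma * (conv(x) - mean) / sqrt(var + eps) + beta
                = (scale * W) * x + (scale * (b - mean) + beta)
    """
    scale = gamma / np.sqrt(var + eps)           # per-output-channel scale
    W_folded = W * scale[:, None, None, None]    # rescale each output filter
    b_folded = scale * (b - mean) + beta         # absorb the BN shift into the bias
    return W_folded, b_folded

After folding, the normalization stage no longer needs a separate layer at inference time, which is consistent with the abstract's claim that merging MBN into the front MC layers removes circuit stages and lowers power consumption.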
Year
2021
DOI
10.1016/j.neucom.2021.01.122
Venue
Neurocomputing
Keywords
Edge intelligence, Memristive GoogLeNet Neural Network, Memristive convolution layer, Compact neural network, Stable circuit
DocType
Journal
Volume
441
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order  Citations  PageRank
Huanhuan Ran    1      1          1.03
Shiping Wen     2      1231       72.34
Kaibo Shi       3      213        25.47
Tingwen Huang   4      5684       310.24