Title
NAS4RRAM: Neural Network Architecture Search for Inference on RRAM-Based Accelerators
Abstract
RRAM-based accelerators enable fast and energy-efficient neural-network inference. However, deploying a network on an RRAM-based accelerator imposes requirements that existing networks do not consider. (1) Device noise and the analog-to-digital/digital-to-analog converters (ADCs/DACs) degrade prediction accuracy, so their effects should be modeled in the network. (2) Because weights are mapped to RRAM cells, they must be quantized, and the total number of weights is limited by the number of RRAM cells in the accelerator. These requirements motivate us to customize hardware-friendly networks for RRAM-based accelerators. We adopt neural architecture search (NAS) to design networks that meet these requirements while retaining high prediction accuracy, and propose a framework called NAS4RRAM that searches for the optimal network on a given RRAM-based accelerator. Experiments demonstrate that NAS4RRAM applies to RRAM-based accelerators of different scales, and the searched networks outperform a manually designed ResNet.
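The two deployment requirements in the abstract (modeling device noise and ADC/DAC effects, and quantizing weights to discrete RRAM conductance states) can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, bit widths, and the multiplicative-Gaussian noise model are illustrative assumptions.

```python
import numpy as np

def quantize(x, num_bits):
    # Uniform symmetric quantization to 2**num_bits levels. For weights this
    # mimics mapping values onto discrete RRAM conductance states; for outputs
    # it mimics the finite resolution of an ADC. (Illustrative model only.)
    scale = np.max(np.abs(x)) / (2 ** (num_bits - 1) - 1)
    if scale == 0:
        return x
    return np.round(x / scale) * scale

def rram_linear(x, w, noise_std=0.02, w_bits=4, adc_bits=8, rng=None):
    # Hypothetical model of one crossbar matrix-vector product:
    # quantized weights, multiplicative device noise, then ADC quantization.
    rng = rng if rng is not None else np.random.default_rng(0)
    wq = quantize(w, w_bits)                                    # weight mapping
    wn = wq * (1.0 + rng.normal(0.0, noise_std, size=wq.shape)) # device noise
    y = x @ wn                                                  # analog MVM
    return quantize(y, adc_bits)                                # ADC readout

rng = np.random.default_rng(42)
x = rng.normal(size=(1, 16))
w = rng.normal(size=(16, 8))
y_ideal = x @ w
y_rram = rram_linear(x, w, rng=rng)
print(np.max(np.abs(y_ideal - y_rram)))  # nonzero deviation from ideal output
```

Training or searching architectures with such a forward model in the loop is what makes the resulting network robust to these hardware effects at deployment time.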
Year: 2021
DOI: 10.1007/s11432-020-3245-7
Venue: SCIENCE CHINA-INFORMATION SCIENCES
Keywords: network architecture search (NAS), neural networks, RRAM-based accelerator, hardware noise, quantization
DocType: Journal
Volume: 64
Issue: 2
ISSN: 1674-733X
Citations: 6
PageRank: 0.38
References: 0
Authors: 8
Name            Order  Citations  PageRank
Zhihang Yuan    1      2          1.39
Jingze Liu      2      2          0.38
Xingchen Li     3      3          1.83
Longhao Yan     4      2          0.38
Haoxiang Chen   5      2          0.38
Bingzhe Wu      6      18         6.41
Yuchao Yang     7      4          4.80
Guangyu Sun     8      1920       111.55