Title
Enabling Resistive-RAM-based Activation Functions for Deep Neural Network Acceleration
Abstract
Resistive-RAM (RRAM)-based deep neural network (DNN) accelerators have shown great potential because they excel at matrix-vector multiplication (MVM). However, this computing paradigm does not benefit other NN operations such as activation, which may be built upon various transcendental functions and therefore requires customized circuits in current RRAM-based NN accelerators. In this paper, we propose the RRAM-CORDIC algorithm and a crossbar design that enable various transcendental activation calculations on an RRAM crossbar, just like MVM. By applying encoding and a multi-iteration transformation, RRAM-CORDIC can exploit a higher degree of MAC (multiply-and-accumulate) parallelism that is traditionally uneconomical in CMOS but efficient in an RRAM crossbar. In addition, it can work in a pipelined manner with high computing throughput. Experimental results show that the RRAM-CORDIC algorithm sustains high accuracy on different transcendental functions and incurs less than 0.5% NN accuracy loss on typical DNN inference. The elimination of CMOS circuitry in turn frees more computing resources for MVM within the same area budget, improving performance by up to 47% for different networks.
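The abstract's key idea is that transcendental activation functions can be reduced to CORDIC-style shift-and-add iterations, which map naturally onto crossbar MAC operations. The following is a minimal Python sketch of standard hyperbolic CORDIC in rotation mode computing tanh (a common activation); it is an illustration of the underlying CORDIC principle only, not the paper's RRAM encoding or multi-iteration transformation, and the function name `cordic_tanh` is our own.

```python
import math

def cordic_tanh(theta, n_iter=20):
    # Hyperbolic CORDIC, rotation mode: each step is a shift-and-add
    # update, i.e. a small MAC -- the kind of operation a crossbar
    # evaluates in parallel. Converges for |theta| <~ 1.118 without
    # argument reduction.
    x, y, z = 1.0, 0.0, theta
    i = 1
    next_repeat = 4  # iterations i = 4, 13, 40, ... must run twice
    steps = 0
    while steps < n_iter:
        for _ in range(2 if i == next_repeat else 1):
            d = 1.0 if z >= 0 else -1.0          # rotation direction
            x, y = x + d * y * 2.0**-i, y + d * x * 2.0**-i
            z -= d * math.atanh(2.0**-i)         # residual angle
            steps += 1
        if i == next_repeat:
            next_repeat = 3 * next_repeat + 1
        i += 1
    # x -> K*cosh(theta), y -> K*sinh(theta); the gain K cancels
    # in the ratio, so no scale-factor correction is needed here.
    return y / x
```

Because the CORDIC gain cancels in the sinh/cosh ratio, tanh needs no scale-factor correction, which is one reason divide-free activations like sigmoid require extra care in hardware mappings.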
Year
2020
DOI
10.1145/3386263.3406915
Venue
GLSVLSI '20: Great Lakes Symposium on VLSI 2020, Virtual Event, China, September 2020
DocType
Conference
ISBN
978-1-4503-7944-1
Citations
0
PageRank
0.34
References
0
Authors
8
Name | Order | Citations | PageRank
Zihan Zhang | 1 | 36 | 10.42
Taozhong Li | 2 | 0 | 0.68
Ning Guan | 3 | 0 | 0.34
Qin Wang | 4 | 1 | 2.08
Guanghui He | 5 | 0 | 3.04
Weiguang Sheng | 6 | 33 | 8.08
Zhigang Mao | 7 | 199 | 41.73
Naifeng Jing | 8 | 152 | 27.07