Title |
---|
Enabling Resistive-RAM-based Activation Functions for Deep Neural Network Acceleration |
Abstract |
---|
Resistive-RAM (RRAM) based deep neural network (DNN) accelerators have shown great potential because they excel at matrix-vector multiplication (MVM). However, this computing paradigm does not benefit other NN operations such as activation, which may be built upon various transcendental functions and therefore requires customized circuits in current RRAM-based NN accelerators. In this paper, we propose the RRAM-CORDIC algorithm and crossbar design, which enable various transcendental activation calculations on an RRAM crossbar just like MVM. By applying encoding and multi-iteration transformation, RRAM-CORDIC can exploit a degree of multiply-and-accumulate (MAC) parallelism that is traditionally uneconomical in CMOS but efficient in an RRAM crossbar. In addition, it can work in a pipelined manner with high computing throughput. Experimental results show that the RRAM-CORDIC algorithm sustains high accuracy on different transcendental functions and incurs less than 0.5% NN accuracy loss on typical DNN inference. The elimination of CMOS circuits in turn frees more computing resources for MVM within the same area budget, improving performance by up to 47% across different networks. |
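As background for the abstract's central idea, the sketch below shows conventional hyperbolic CORDIC in rotation mode, the shift-and-add algorithm the paper adapts to RRAM crossbars. This is a plain software illustration of classic CORDIC, not the paper's RRAM-CORDIC encoding or crossbar mapping; the function name and iteration count are illustrative choices. Computing tanh as sinh/cosh = y/x has the convenient property that the CORDIC scale factor cancels.

```python
import math

def cordic_tanh(theta, n_iter=32):
    """Approximate tanh(theta) with classic hyperbolic CORDIC (rotation mode).

    Shift indices run i = 1, 2, 3, ..., with i = 4, 13, 40, ... repeated once
    each so the micro-rotation angles atanh(2^-i) sum over the convergence
    range (|theta| < ~1.118). Each iteration uses only shifts and adds, which
    is what makes CORDIC attractive for hardware.
    """
    x, y, z = 1.0, 0.0, theta  # start on the unit hyperbola, residual angle z
    i, repeats, count = 1, {4, 13, 40}, 0
    while count < n_iter:
        for _ in range(2 if i in repeats else 1):  # repeat selected indices
            sigma = 1.0 if z >= 0 else -1.0        # rotate toward z = 0
            t = 2.0 ** -i                          # shift: multiply by 2^-i
            x, y = x + sigma * y * t, y + sigma * x * t
            z -= sigma * math.atanh(t)             # subtract rotated angle
            count += 1
            if count >= n_iter:
                break
        i += 1
    # x converges to K*cosh(theta), y to K*sinh(theta); the gain K cancels.
    return y / x
```

In a CMOS datapath each micro-rotation is a pair of shift-adds applied serially; the paper's observation is that an RRAM crossbar can absorb these MACs with parallelism that would be uneconomical in CMOS.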
Year | DOI | Venue |
---|---|---|
2020 | 10.1145/3386263.3406915 | GLSVLSI '20: Great Lakes Symposium on VLSI 2020, Virtual Event, China, September 2020 |
DocType | ISBN | Citations |
---|---|---|
Conference | 978-1-4503-7944-1 | 0 |

PageRank | References | Authors |
---|---|---|
0.34 | 0 | 8 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zihan Zhang | 1 | 36 | 10.42 |
Taozhong Li | 2 | 0 | 0.68 |
Ning Guan | 3 | 0 | 0.34 |
Qin Wang | 4 | 1 | 2.08 |
Guanghui He | 5 | 0 | 3.04 |
Weiguang Sheng | 6 | 33 | 8.08 |
Zhigang Mao | 7 | 199 | 41.73 |
Naifeng Jing | 8 | 152 | 27.07 |