Title |
---|
Softmax Regression Design for Stochastic Computing Based Deep Convolutional Neural Networks |
Abstract |
---|
Recently, Deep Convolutional Neural Networks (DCNNs) have made tremendous advances, achieving accuracy close to or even surpassing human-level perception in various tasks. Stochastic Computing (SC), as an alternative to the conventional binary computing paradigm, has the potential to enable massively parallel and highly scalable hardware implementations of DCNNs. In this paper, we design and optimize an SC-based Softmax Regression (SR) function. Experimental results show that, compared with a binary SR, the proposed SC-SR with longer bit streams reaches the same level of accuracy with improvements of 295X, 62X, and 2617X in power, area, and energy, respectively. Binary SR is suggested for future DCNNs with short input bit streams, whereas SC-SR is recommended for longer bit streams. |
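For reference, the Softmax Regression (SR) function that the paper maps onto stochastic computing hardware is, in conventional binary arithmetic, a linear layer followed by the standard softmax. A minimal NumPy sketch is below; the function and variable names are illustrative, not taken from the paper, and this is the binary baseline rather than the SC implementation:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: subtract the max logit before exponentiating."""
    z = np.asarray(z, dtype=float)
    e = np.exp(z - z.max())
    return e / e.sum()

def softmax_regression(x, W, b):
    """Softmax regression layer: class probabilities from logits W @ x + b."""
    return softmax(W @ x + b)
```

The output is a probability distribution over classes; in the paper's setting this layer sits at the end of a DCNN, and the SC-SR design approximates the same computation with stochastic bit streams.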
Year | DOI | Venue |
---|---|---|
2017 | 10.1145/3060403.3060467 | ACM Great Lakes Symposium on VLSI |
Field | DocType | Citations |
---|---|---|
Softmax function, Convolutional neural network, Massively parallel, Computer science, Algorithm, Real-time computing, Artificial intelligence, Deep learning, Bitstream, Stochastic computing, Binary number, Scalability | Conference | 5 |

PageRank | References | Authors |
---|---|---|
0.42 | 8 | 9 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zihao Yuan | 1 | 9 | 0.85 |
Ji Li | 2 | 97 | 10.87 |
Qinru Qiu | 3 | 1120 | 102.58 |
Qinru Qiu | 4 | 1120 | 102.58 |
Caiwen Ding | 5 | 142 | 26.52 |
Ao Ren | 6 | 96 | 11.53 |
Bo Yuan | 7 | 262 | 28.64 |
Jeff Draper | 8 | 298 | 26.31 |
Yanzhi Wang | 9 | 1082 | 136.11 |