Title
---
Enabling High-Quality Uncertainty Quantification in a PIM Designed for Bayesian Neural Network |
Abstract
---
Uncertainty quantification measures the prediction uncertainty of a neural network on out-of-training-distribution samples. Bayesian Neural Networks (BNNs) can provide high-quality uncertainty quantification by introducing specific noise into the weights during inference. To accelerate BNN inference, the ReRAM processing-in-memory (PIM) architecture is a competitive solution that provides both highly efficient computing and in-situ noise generation. However, there is typically a large gap between the noise generated by PIM hardware and the noise required by a BNN model. We demonstrate that this gap substantially degrades the quality of uncertainty quantification. To solve this problem, we propose a holistic framework called W2W-PIM. We first introduce an efficient method to generate noise in a ReRAM PIM design according to the demands of a BNN model. In addition, the PIM architecture is carefully modified to enable this noise generation and to evaluate uncertainty quality. Moreover, a calibration unit is further introduced to reduce the noise gap caused by imperfections of the noise model. Comprehensive evaluation results demonstrate that the W2W-PIM framework achieves high-quality uncertainty quantification and high energy efficiency at the same time.
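The mechanism the abstract describes, injecting noise into the weights at inference time and reading out prediction uncertainty, can be illustrated with a minimal Monte Carlo sketch. This is a generic illustration under a Gaussian weight-noise assumption, not the paper's actual noise model or PIM noise-generation scheme; all names (`bnn_forward`, `predictive_stats`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def bnn_forward(x, w_mean, w_std):
    # One stochastic forward pass of a toy linear layer:
    # sample weights from N(w_mean, w_std^2), then apply them.
    # (Gaussian noise is an assumption; the paper's noise model differs.)
    w = w_mean + w_std * rng.standard_normal(w_mean.shape)
    return x @ w

def predictive_stats(x, w_mean, w_std, n_samples=100):
    # Monte Carlo estimate: run many noisy passes, then use the
    # spread of the outputs as the uncertainty estimate.
    outs = np.stack([bnn_forward(x, w_mean, w_std) for _ in range(n_samples)])
    return outs.mean(axis=0), outs.std(axis=0)

# Toy weight distribution for a 2-input, 1-output layer.
w_mean = np.array([[1.0], [2.0]])
w_std = 0.1 * np.ones_like(w_mean)

x_in = np.array([[1.0, 1.0]])     # input at the training scale
x_ood = np.array([[10.0, 10.0]])  # larger-magnitude, "unfamiliar" input

m_in, s_in = predictive_stats(x_in, w_mean, w_std)
m_ood, s_ood = predictive_stats(x_ood, w_mean, w_std)
# The larger input amplifies the same weight noise, so its predictive
# std is larger: the model reports higher uncertainty on the OOD sample.
```

If the hardware-generated noise deviates from the `w_std` the model was trained for, the uncertainty estimate produced this way is distorted; that is the noise gap the W2W-PIM framework targets.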
Year | DOI | Venue |
---|---|---|
2022 | 10.1109/HPCA53966.2022.00080 | 2022 IEEE International Symposium on High-Performance Computer Architecture (HPCA) |
Keywords | DocType | ISSN
---|---|---
ReRAM, Bayesian Neural Network, Analog Computing, Noise | Conference | 1530-0897

ISBN | Citations | PageRank
---|---|---
978-1-6654-2028-0 | 0 | 0.34

References | Authors
---|---
0 | 14
Name | Order | Citations | PageRank |
---|---|---|---|
Xingchen Li | 1 | 3 | 1.83 |
Bingzhe Wu | 2 | 18 | 6.41 |
Guangyu Sun | 3 | 1920 | 111.55 |
Zhe Zhang | 4 | 6 | 9.60 |
Zhihang Yuan | 5 | 0 | 0.34 |
Runsheng Wang | 6 | 169 | 21.11 |
Ru Huang | 7 | 188 | 48.74 |
Dimin Niu | 8 | 0 | 0.34 |
Hongzhong Zheng | 9 | 0 | 0.34 |
Zhichao Lu | 10 | 0 | 0.34 |
Liang Zhao | 11 | 0 | 0.34 |
Meng-Fan Chang | 12 | 459 | 45.63 |
Tianchan Guan | 13 | 0 | 0.34 |
Xin Si | 14 | 0 | 0.34 |