Title
Successive Log Quantization for Cost-Efficient Neural Networks Using Stochastic Computing
Abstract
Despite the multifaceted benefits of stochastic computing (SC) such as low cost, low power, and flexible precision, SC-based deep neural networks (DNNs) still suffer from the long-latency problem, especially those with high precision requirements. While log quantization can help, it has its own accuracy-saturation problem due to an uneven precision distribution. In this paper we propose successive log quantization (SLQ), which extends log quantization with significant improvements in precision and accuracy, and apply it to state-of-the-art SC-DNNs. SLQ reuses the existing datapath of log quantization, and thus retains its advantages such as simple multiplier hardware. Our experimental results demonstrate that SLQ can significantly extend both the accuracy and efficiency of SC-DNNs over state-of-the-art solutions, including linear-quantized and log-quantized SC-DNNs, achieving less than a 1~1.5%p accuracy drop for AlexNet, SqueezeNet, and VGG-S at a mere 4~5-bit weight resolution.
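The abstract describes SLQ only at a high level. A minimal NumPy sketch of the underlying idea — quantize each weight to a signed power of two, then log-quantize the remaining residual in successive passes — might look like the following. The function names, bit budget, and exponent range below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def log_quantize(w, bits=4):
    # Plain log quantization: snap each weight to the nearest
    # power of two, preserving sign (zero stays zero).
    sign = np.sign(w)
    mag = np.abs(w)
    exp = np.round(np.log2(np.maximum(mag, 1e-12)))
    # Illustrative exponent range for a `bits`-bit exponent field.
    exp = np.clip(exp, -(2 ** (bits - 1)), 0)
    return sign * 2.0 ** exp

def successive_log_quantize(w, bits=4, levels=2):
    # SLQ-style idea: repeatedly log-quantize the residual error,
    # so the approximation is a sum of signed powers of two.
    approx = np.zeros_like(w)
    for _ in range(levels):
        approx += log_quantize(w - approx, bits)
    return approx
```

With two levels, a weight such as 0.3 becomes 2^-2 + 2^-4 = 0.3125 instead of the single-term 0.25, which is how the successive passes recover the precision that plain log quantization loses while keeping a shift-based (power-of-two) multiplier datapath.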
Year
2019
DOI
10.1145/3316781.3317916
Venue
Proceedings of the 56th Annual Design Automation Conference 2019
Keywords
Adaptive precision, Deep neural network (DNN), Logarithmic quantization, Stochastic computing
Field
Datapath, Computer science, Real-time computing, Multiplier (economics), Electronic design automation, Accuracy and precision, Artificial neural network, Quantization (signal processing), Stochastic computing, Computer engineering, Cost efficiency
DocType
Conference
ISSN
0738-100X
ISBN
978-1-4503-6725-7
Citations
3
PageRank
0.40
References
3
Authors
4
Name | Order | Citations | PageRank
Sugil Lee | 1 | 8 | 2.60
Hyeon Uk Sim | 2 | 31 | 5.19
Jooyeon Choi | 3 | 4 | 1.09
Jongeun Lee | 4 | 429 | 33.71