Title
SkippyNN: An Embedded Stochastic-Computing Accelerator for Convolutional Neural Networks
Abstract
Employing convolutional neural networks (CNNs) in embedded devices calls for novel low-cost and energy-efficient CNN accelerators. Stochastic computing (SC) is a promising low-cost alternative to conventional binary implementations of CNNs. Despite this cost advantage, SC-based arithmetic units suffer from prohibitive execution times because they process long bit-streams. In particular, multiplication, the main operation in convolution computation, is extremely time-consuming, which hampers the use of SC methods in embedded CNNs. In this work, we propose a novel architecture, called SkippyNN, that reduces the computation time of SC-based multiplications in the convolutional layers of CNNs. Each convolution in a CNN comprises numerous multiplications in which each input value is multiplied by a weight vector. Once the first multiplication has been performed, each subsequent multiplication can be carried out by multiplying the input by the difference between successive weights. Leveraging this property, we develop a differential Multiply-and-Accumulate unit, called DMAC, to reduce the time consumed by convolutions in SkippyNN. We evaluate the efficiency of SkippyNN using four modern CNNs. On average, SkippyNN offers a 1.2x speedup and 2.7x energy saving compared to binary implementations of CNN accelerators.
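The differential trick the abstract describes can be sketched in plain integer arithmetic (this is an illustration of the idea, not the paper's stochastic bit-stream implementation; the function name is ours):

```python
def differential_products(x, weights):
    """Compute [x * w for w in weights] using one direct multiply plus
    multiplies by successive weight differences:
        x * w[i+1] = x * w[i] + x * (w[i+1] - w[i]).
    In stochastic computing, a small difference maps to a short
    bit-stream, which is where DMAC's time saving comes from."""
    if not weights:
        return []
    products = [x * weights[0]]  # first product computed directly
    for i in range(1, len(weights)):
        diff = weights[i] - weights[i - 1]  # typically small for trained CNN weights
        products.append(products[-1] + x * diff)
    return products

# The convolution's accumulated term is then simply the sum of these products.
x, w = 3, [5, 6, 4, 4]
assert differential_products(x, w) == [x * wi for wi in w]
```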
Year: 2019
DOI: 10.1145/3316781.3317911
Venue: Proceedings of the 56th Annual Design Automation Conference 2019
Field: Convolutional neural network, Convolution, Computer science, Efficient energy use, Weight, Electronic engineering, Multiplication, Computational science, Stochastic computing, Binary number, Speedup
DocType: Conference
ISBN: 978-1-4503-6725-7
Citations: 8
PageRank: 0.48
References: 0
Authors (7)
Name                     Order  Citations  PageRank
Reza Hojabr              1      8          0.48
Givaki Kamyar            2      10         1.53
S. M. Reza Tayaranian    3      8          0.48
Parsa Esfahanian         4      8          0.48
Ahmad Khonsari           5      2104       2.43
Dara Rahmati             6      58         6.65
M. Hassan Najafi         7      881        1.06