Title |
---|
Sparse BD-Net: A Multiplication-less DNN with Sparse Binarized Depth-wise Separable Convolution |
Abstract |
---|
In this work, we propose BD-Net, a multiplication-less binarized depthwise-separable convolutional neural network. BD-Net uses a binarized depthwise-separable convolution block as a drop-in replacement for conventional spatial convolution in deep convolutional neural networks (DNNs). In BD-Net, the computation-expensive convolution operations (i.e., multiply-and-accumulate) are converted into energy-efficient addition/subtraction operations. To further compress the model size while keeping addition/subtraction as the dominant computation, we propose a new sparse binarization method with a hardware-oriented structured sparsity pattern. To successfully train such a sparse BD-Net, we propose and leverage two techniques: (1) a modified group-lasso regularization whose group size matches the capacity of the basic computing core in the accelerator, and (2) a weight-penalty clipping technique that resolves the disharmony between weight binarization and lasso regularization. Experimental results show that the proposed sparse BD-Net achieves comparable or even better inference accuracy than the full-precision CNN baseline. Beyond that, we design a BD-Net-customized processing-in-memory accelerator using SOT-MRAM, which offers high channel-expansion flexibility and computation parallelism. Through detailed analysis from both software and hardware perspectives, we provide intuitive design guidance for software/hardware co-design of DNN acceleration on mobile embedded systems. Note that this journal submission is an extended version of our previously published paper at ISVLSI 2018 [24]. |
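Two ideas summarized in the abstract can be sketched concretely: weights binarized to {-1, +1} turn every multiply in a convolution into an add or subtract, and a group-lasso penalty can use a group size matched to a hardware computing core. The following is an illustrative NumPy sketch, not the authors' implementation; all function names, tensor shapes, and the group size are assumptions for demonstration.

```python
import numpy as np

def binarize(w):
    """Binarize real-valued (shadow) weights to {-1, +1} via the sign function."""
    return np.where(w >= 0, 1.0, -1.0)

def binary_dot(x, w_bin):
    """Multiplication-less dot product: add inputs where the binary weight
    is +1 and subtract where it is -1, instead of multiplying."""
    return x[w_bin > 0].sum() - x[w_bin < 0].sum()

def group_lasso_penalty(w, group_size):
    """Group-lasso penalty: sum of L2 norms over fixed-size weight groups.
    In the paper the group size is matched to the accelerator's basic
    computing-core capacity; the size used here is purely illustrative."""
    groups = w.reshape(-1, group_size)
    return np.sqrt((groups ** 2).sum(axis=1)).sum()

rng = np.random.default_rng(0)
x = rng.standard_normal(16)      # a flattened receptive field (assumed shape)
w = rng.standard_normal(16)      # real-valued shadow weights
w_bin = binarize(w)

# The add/subtract form matches an ordinary MAC with binary weights.
assert np.isclose(binary_dot(x, w_bin), x @ w_bin)
```

Driving such group norms toward zero during training yields whole-group (structured) sparsity, which maps naturally onto the accelerator's compute units.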
Field | Value |
---|---|
Year | 2020 |
DOI | 10.1145/3369391 |
Venue | ACM Journal on Emerging Technologies in Computing Systems |
Keywords | Deep neural network, in-memory computing, model compression |
DocType | Journal |
Volume | 16 |
Issue | 2 |
ISSN | 1550-4832 |
Citations | 4 |
PageRank | 0.37 |
References | 15 |
Authors | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zhezhi He | 1 | 136 | 25.37 |
Li Yang | 2 | 10 | 4.50 |
Shaahin Angizi | 3 | 221 | 26.13 |
Adnan Siraj Rakin | 4 | 30 | 7.89 |
Deliang Fan | 5 | 375 | 53.66 |