Title
ReBNet: Residual Binarized Neural Network
Abstract
This paper proposes ReBNet, an end-to-end framework for training reconfigurable binary neural networks in software and developing efficient accelerators for execution on FPGA. Binary neural networks offer an intriguing opportunity for deploying large-scale deep learning models on resource-constrained devices. Binarization reduces the memory footprint and replaces power-hungry matrix multiplication with lightweight XnorPopcount operations. However, binary networks suffer from degraded accuracy compared to their fixed-point counterparts. We show that state-of-the-art methods for improving the accuracy of binary networks significantly increase the implementation cost and complexity. To compensate for the degraded accuracy while adhering to the simplicity of binary networks, we devise the first reconfigurable scheme that can adjust the classification accuracy based on the application. Our scheme improves the classification accuracy by representing features with multiple levels of residual binarization. Unlike previous methods, our approach does not exacerbate the area cost of the hardware accelerator. Instead, it provides a tradeoff between throughput and accuracy, while the area overhead of multi-level binarization is negligible.
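Two ideas from the abstract can be made concrete with a small numeric sketch: computing dot products of {-1,+1} vectors with XnorPopcount instead of multiply-accumulate, and approximating a feature vector with multiple levels of residual binarization. The sketch below is illustrative only and is not the paper's implementation: the number of levels, the per-level scale factors (gammas), and the helper names are assumptions chosen for the example, whereas a trained network such as ReBNet would learn its scale factors rather than fix them by hand.

```python
import numpy as np

def residual_binarize(x, gammas):
    """Encode x as a sum of scaled sign vectors: x ~ sum_i gamma_i * sign(r_i),
    where r_1 = x and each residual is r_{i+1} = r_i - gamma_i * sign(r_i)."""
    residual = x.astype(np.float64)
    bits = []
    for g in gammas:
        b = np.sign(residual)
        b[b == 0] = 1.0              # map zeros to +1 so every entry gets a bit
        bits.append(b)
        residual = residual - g * b  # what this level failed to capture
    return bits

def xnor_popcount_dot(a_bits, w_bits):
    """Dot product of two {-1,+1} vectors via XNOR + popcount.
    With n elements and p sign matches, the dot product equals 2*p - n."""
    a = np.asarray(a_bits) > 0       # {-1,+1} -> {0,1}
    w = np.asarray(w_bits) > 0
    n = a.size
    p = np.count_nonzero(~(a ^ w))   # XNOR, then count matching positions
    return 2 * p - n

# Toy example: 3-level residual binarization of a feature vector, then a
# binarized dot product with a {-1,+1} weight vector (values are made up).
x = np.array([0.9, -0.4, 0.2, -1.1])
gammas = [0.6, 0.3, 0.15]                 # illustrative per-level scale factors
bits = residual_binarize(x, gammas)
approx = sum(g * b for g, b in zip(gammas, bits))
print("approximation:", approx)           # close to x, refined level by level

w = np.array([1.0, -1.0, -1.0, 1.0])      # binary weights in {-1,+1}
dot = sum(g * xnor_popcount_dot(b, w) for g, b in zip(gammas, bits))
print("binarized dot:", dot, "vs full precision:", float(x @ w))
```

Adding levels refines the approximation (and the accuracy) at the cost of more XnorPopcount passes, which is the throughput-versus-accuracy tradeoff the abstract refers to.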
Year: 2018
DOI: 10.1109/FCCM.2018.00018
Venue: 2018 IEEE 26th Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM)
Keywords: Deep neural networks, Reconfigurable computing, Domain customized computing, Binary neural network, Residual binarization
DocType: Conference
Volume: abs/1711.01243
ISBN: 978-1-5386-5523-8
Citations: 7
PageRank: 0.50
References: 20
Authors: 3
Name                   Order   Citations   PageRank
Mohammad Ghasemzadeh   1       17          2.73
Mohammad Samragh       2       38          7.01
Farinaz Koushanfar     3       3055        268.84