Title
daBNN: A Super Fast Inference Framework for Binary Neural Networks on ARM devices
Abstract
It is widely believed that Binary Neural Networks (BNNs) can drastically accelerate inference by replacing the arithmetic operations in float-valued Deep Neural Networks (DNNs) with bit-wise operations. Nevertheless, there has been no open-source implementation supporting this idea on low-end ARM devices (e.g., mobile phones and embedded devices). In this work, we propose daBNN --- a super fast inference framework that implements BNNs on ARM devices. Several speed-up and memory refinement strategies for bit-packing, binarized convolution, and memory layout are uniquely devised to enhance inference efficiency. Compared to BMXNet, the most recent open-source BNN inference framework, daBNN is 7x~23x faster on a single binary convolution and about 6x faster on Bi-Real Net 18 (a BNN variant of ResNet-18). daBNN is BSD-licensed, and its source code, sample projects and pre-trained models are available online: https://github.com/JDAI-CV/dabnn.
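The abstract's central claim, that bit-wise operations can stand in for floating-point arithmetic, rests on the standard XNOR/popcount trick: 64 binarized (+1/-1) values are packed into a single 64-bit word, so one bitwise instruction plus one popcount replaces 64 multiply-accumulates. The sketch below illustrates the general technique in portable C++; the names pack_signs and binary_dot are illustrative only and are not daBNN's actual API, which is hand-tuned for ARM.

#include <cstdint>
#include <cstdio>
#include <bitset>

// Bit-packing step: compress the signs of 64 floats into one 64-bit word,
// setting bit i iff x[i] >= 0 (i.e., the value binarizes to +1).
static uint64_t pack_signs(const float* x) {
    uint64_t word = 0;
    for (int i = 0; i < 64; ++i) {
        if (x[i] >= 0.0f) word |= (1ULL << i);
    }
    return word;
}

// Dot product of two binarized vectors stored as n packed 64-bit words.
// With values in {-1, +1}: dot = (#matching bits) - (#mismatching bits)
//                              = 64*n - 2 * popcount(a XOR b).
static int binary_dot(const uint64_t* a, const uint64_t* b, int n) {
    int mismatches = 0;
    for (int i = 0; i < n; ++i) {
        mismatches += static_cast<int>(std::bitset<64>(a[i] ^ b[i]).count());
    }
    return 64 * n - 2 * mismatches;
}

int main() {
    float x[64], w[64];
    for (int i = 0; i < 64; ++i) {
        x[i] = (i % 3 == 0) ? -1.0f : 1.0f;  // toy activations
        w[i] = (i % 2 == 0) ? 1.0f : -1.0f;  // toy weights
    }
    const uint64_t xp = pack_signs(x);
    const uint64_t wp = pack_signs(w);
    std::printf("binary dot = %d\n", binary_dot(&xp, &wp, 1));
    return 0;
}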
Year
2019
DOI
10.1145/3343031.3350534
Venue
Proceedings of the 27th ACM International Conference on Multimedia
Keywords
binary neural networks, machine learning, open source
Field
Computer vision, Computer science, Inference, Binary neural network, Artificial intelligence
DocType
Conference
ISBN
978-1-4503-6889-6
Citations
7
PageRank
0.44
References
0
Authors
5
Name            Order  Citations  PageRank
Jianhao Zhang   1      7          1.12
Yingwei Pan     2      357        23.66
Ting Yao        3      842        52.62
He Zhao         4      11         1.14
Tao Mei         5      4702       288.54