Title
Regularizing Activation Distribution for Training Binarized Deep Networks
Abstract
Binarized Neural Networks (BNNs) can significantly reduce inference latency and energy consumption on resource-constrained devices thanks to their purely logical computation and reduced memory accesses. However, training BNNs is difficult because the activation flow suffers from degeneration, saturation, and gradient mismatch. Prior work alleviates these issues by increasing the number of activation bits or adding floating-point scaling factors, thereby sacrificing the energy efficiency of BNNs. In this paper, we propose a distribution loss that explicitly regularizes the activation flow, and we develop a framework to systematically formulate this loss. Our experiments show that the distribution loss consistently improves the accuracy of BNNs without losing their energy benefits. Moreover, equipped with the proposed regularization, BNN training is shown to be robust to the selection of hyper-parameters, including the optimizer and the learning rate.
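The abstract names the technique (a distribution loss on the activation flow) but not its formulation. As a rough illustration only, the PyTorch sketch below penalizes two of the pathologies the abstract lists, degeneration and saturation, on the real-valued pre-activations feeding a sign() binarizer. The class name DistributionLoss, the margin threshold, and both penalty terms are assumptions made here for illustration; the paper derives its loss through the framework it describes, which this sketch does not reproduce.

```python
import torch
import torch.nn as nn

class DistributionLoss(nn.Module):
    """Hypothetical activation-distribution regularizer (illustration only,
    not the loss formulated in the paper).

    Penalizes, per channel, two pathologies named in the abstract:
      * degeneration: activations drift to one side of zero, so the
        sign() binarizer collapses to a constant output;
      * saturation: activations leave [-margin, margin], where the
        straight-through estimator passes zero gradient.
    """

    def __init__(self, margin: float = 1.0):
        super().__init__()
        self.margin = margin  # assumed clip range of the straight-through estimator

    def forward(self, pre_act: torch.Tensor) -> torch.Tensor:
        # Collapse batch and spatial dims so statistics are per channel:
        # (B, C, H, W) -> (B, C, H*W); (B, C) -> (B, C, 1).
        a = pre_act.flatten(2) if pre_act.dim() > 2 else pre_act.unsqueeze(2)
        # Degeneration: pull each channel's mean toward zero so both
        # binary states {-1, +1} stay reachable.
        degeneration = a.mean(dim=(0, 2)).pow(2).mean()
        # Saturation: quadratic hinge on the part of |a| beyond the margin.
        saturation = (a.abs() - self.margin).clamp(min=0).pow(2).mean()
        return degeneration + saturation
```

In such a setup the regularizer would be added to the task loss under a small weight, e.g. loss = cross_entropy + 1e-4 * reg(pre_act); both the weight and where the hook is placed are tunable choices assumed here, not values taken from the paper.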
Year
2019
DOI
10.1109/CVPR.2019.01167
Venue
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019)
Field
Pattern recognition, Efficient energy use, Latency (engineering), Computer science, Inference, Regularization (mathematics), Artificial intelligence, Artificial neural network, Energy consumption, Scaling, Computation
DocType
Journal
Volume
abs/1904.02823
ISSN
1063-6919
Citations
8
PageRank
0.42
References
0
Authors
4
Name              Order  Citations  PageRank
Ruizhou Ding      1      42         5.98
Ting-Wu Chin      2      26         5.66
Zeye Liu          3      10         3.15
Diana Marculescu  4      2725       223.87