Title
TP-ADMM: An Efficient Two-Stage Framework for Training Binary Neural Networks
Abstract
Deep Neural Networks (DNNs) are powerful and successful but suffer from high computation and memory costs. As a useful remedy, binary neural networks represent weights and activations with binary values, which can significantly reduce resource consumption. However, binarizing weights and activations simultaneously introduces a coupling effect that aggravates the difficulty of training. In this paper, we develop a novel framework named TP-ADMM that decouples the binarization process into two iteratively optimized stages. First, we propose an improved target propagation method to optimize the network with binary activations in a more stable form. Second, we apply the alternating direction method of multipliers (ADMM) with a varying penalty to binarize the weights, formulating weight binarization as a discretely constrained optimization problem. Experiments on three public image classification datasets show that the proposed method outperforms existing methods.
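The ADMM stage summarized in the abstract can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the toy quadratic loss, the hyperparameters (`rho`, `lr`, the `1.01` penalty growth factor), and the scaled-sign projection onto the binary set are all placeholders chosen to show the variable-splitting structure.

```python
import numpy as np

def project_binary(v):
    # Project v onto the set {-alpha, +alpha}^n using the common
    # scaled-sign binarization (alpha = mean absolute value).
    alpha = np.mean(np.abs(v))
    return alpha * np.sign(np.where(v == 0, 1.0, v))

def admm_binarize(w_init, grad_loss, steps=200, rho=1e-2, lr=1e-1):
    # ADMM split: w carries the loss, g carries the binary constraint,
    # u is the scaled dual variable tying them together.
    w = w_init.copy()
    g = project_binary(w)
    u = np.zeros_like(w)
    for _ in range(steps):
        # w-update: gradient step on loss plus the quadratic penalty term
        w -= lr * (grad_loss(w) + rho * (w - g + u))
        # g-update: projection of w + u onto the binary set
        g = project_binary(w + u)
        # dual ascent on the constraint residual
        u += w - g
        # varying (increasing) penalty, as described in the abstract
        rho *= 1.01
    return g

# Toy example: a quadratic loss pulling the weights toward a target vector.
target = np.array([0.9, -1.1, 0.4, -0.3])
grad = lambda w: w - target          # gradient of 0.5 * ||w - target||^2
w_bin = admm_binarize(np.zeros_like(target), grad)
```

By construction the returned `w_bin` takes only the two values ±alpha, while the w-update keeps tracking the (unconstrained) loss, which is the decoupling the framework relies on.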
Year: 2019
DOI: 10.1007/978-3-030-36808-1_63
Venue: Neural Information Processing (ICONIP 2019), Part IV
Keywords: Binary neural network, ADMM, Target propagation
DocType: Conference
Volume: 1142
ISSN: 1865-0929
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name        Order  Citations  PageRank
Yong Yuan   1      239        31.09
Chen Chen   2      0          3.38
Xiyuan Hu   3      108        19.03
S. Peng     4      332        40.36