Title
Asymmetric Ternary Networks
Abstract
Deep Neural Networks (DNNs) are widely used in a variety of machine learning tasks, especially speech recognition and image classification. However, their large memory and computational demands make it difficult to deploy DNNs efficiently on embedded devices. In this paper, we propose asymmetric ternary networks (ATNs), neural networks with weights constrained to the ternary values {-α1, 0, +α2}, which reduce DNN model size by about 16× compared with 32-bit full-precision models. The scaling factors {α1, α2} are used to reduce the quantization loss between the ternary weights and the full-precision weights. We compare ATNs with the recently proposed ternary weight networks (TWNs) and with full-precision networks on the CIFAR-10 and ImageNet datasets. The results show that our ATN models outperform the full-precision VGG13 and VGG16 models by 0.11% and 0.33%, respectively, on CIFAR-10. On ImageNet, our model outperforms the TWN AlexNet model by 2.25% Top-1 accuracy and shows only 0.63% accuracy degradation relative to the full-precision counterpart.
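The abstract states only that weights take values in {-α1, 0, +α2} and that the two scales are chosen to reduce quantization loss; it does not give the ternarization rule. The sketch below is a minimal illustration, assuming a TWN-style threshold proportional to the mean absolute weight and per-sign scales equal to the mean magnitude of the retained weights; the function name, threshold ratio, and scale formulas are assumptions, not the authors' published method.

import numpy as np

def asymmetric_ternarize(w, delta_ratio=0.7):
    # Assumed TWN-style threshold: a fixed fraction of the mean
    # absolute weight decides which weights are zeroed out.
    delta = delta_ratio * np.abs(w).mean()
    pos = w > delta    # weights mapped to +alpha2
    neg = w < -delta   # weights mapped to -alpha1
    # Assumed per-sign scales: the mean magnitude of the retained
    # weights on each side minimizes the L2 error for that side.
    alpha1 = np.abs(w[neg]).mean() if neg.any() else 0.0
    alpha2 = w[pos].mean() if pos.any() else 0.0
    w_t = np.zeros_like(w)
    w_t[pos] = alpha2
    w_t[neg] = -alpha1
    return w_t, alpha1, alpha2

# Example: ternarize a random weight tensor and inspect the error.
w = np.random.randn(64, 64).astype(np.float32)
w_t, a1, a2 = asymmetric_ternarize(w)
print(a1, a2, np.linalg.norm(w - w_t))

Because α1 and α2 are fitted independently, the positive and negative sides of an asymmetric weight distribution are each approximated with their own scale, which is the motivation the abstract gives for moving beyond a single shared scaling factor.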
Year: 2017
DOI: 10.1109/ICTAI.2017.00021
Venue: 2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI)
Keywords: Deep Neural Networks, Asymmetric Ternary Networks, Model Compression, Embedded Efficient Neural Networks
DocType: Conference
ISSN: 1082-3409
ISBN: 978-1-5386-3877-4
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name        Order  Citations  PageRank
Jie Ding    1      53         19.63
Junmin Wu   2      3          2.77
Huan Wu     3      0          0.34