Title
Towards Efficient Compact Network Training on Edge-Devices
Abstract
Currently, there is a trend to deploy training on edge devices, which is crucial to future AI applications in various scenarios with transfer and online learning demands. Specifically, there may be severe accuracy degradation when directly deploying trained models on edge devices, because the local environment forms an edge-local dataset that often differs from the generic dataset. However, training on edge devices with limited computing and memory capability is a challenging problem. In this paper, we propose a novel quantization training framework for efficient compact network training on edge devices. First, training-aware symmetric quantization is introduced to quantize all of the data types in the training process. Then, a channel-wise quantization method is adopted for compact network quantization, which has significantly higher tolerance to quantization errors and makes the training process more stable. For further training efficiency, we build a hardware evaluation platform to evaluate different network settings, so as to achieve a better trade-off among accuracy, energy and latency. Finally, we evaluate two widely used compact networks on a domain adaptation dataset for image classification. The results demonstrate that the proposed methods achieve an 8.4×-17.2× reduction in energy and an 11.9×-16.3× reduction in latency compared with 32-bit implementations, while maintaining classification accuracy.
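The channel-wise symmetric quantization mentioned in the abstract can be illustrated with a minimal sketch: each output channel of a weight tensor gets its own scale, so one outlier channel does not degrade the resolution of the others. The function name, the 8-bit setting, and the NumPy formulation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def symmetric_quantize_per_channel(weights, num_bits=8):
    """Hypothetical sketch of channel-wise symmetric quantization.

    weights: float tensor of shape (out_channels, ...).
    Returns (q, scales): int8 values and one float scale per channel,
    so that q * scale approximately reconstructs the original weights.
    """
    qmax = 2 ** (num_bits - 1) - 1          # e.g. 127 for 8 bits
    flat = weights.reshape(weights.shape[0], -1)
    # Symmetric: scale chosen from the max absolute value per channel.
    scales = np.abs(flat).max(axis=1) / qmax
    scales = np.where(scales == 0, 1.0, scales)  # guard all-zero channels
    q = np.round(flat / scales[:, None]).astype(np.int8)
    return q.reshape(weights.shape), scales

# Dequantization is simply q * scale, broadcast over each channel.
```

Per-channel scales are what give the higher tolerance to quantization error claimed in the abstract: with a single per-tensor scale, the channel with the largest magnitude would dictate the quantization step for every other channel.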
Year
2019
DOI
10.1109/ISVLSI.2019.00020
Venue
2019 IEEE Computer Society Annual Symposium on VLSI (ISVLSI)
Keywords
training, compact network, quantization
Field
Convolution, Computer science, Data type, Edge device, Memory management, Contextual image classification, Quantization (signal processing), Backpropagation, Computer engineering, Applications of artificial intelligence
DocType
Conference
ISSN
2159-3469
ISBN
978-1-7281-3392-8
Citations
0
PageRank
0.34
References
3
Authors
4
Name | Order | Citations | PageRank
Feng Xiong | 1 | 5 | 1.54
Fengbin Tu | 2 | 71 | 8.62
Shouyi Yin | 3 | 579 | 99.95
Shaojun Wei | 4 | 555 | 102.32