Title
ADDA: Adaptive Distributed DNN Inference Acceleration in Edge Computing Environment
Abstract
Implementing intelligent mobile applications on IoT devices with DNN technology has become an inevitable trend. Due to the limited size of DNN models that can be deployed on end devices and the instability of wide-area network transmission, neither the End-only mode nor the Cloud-only mode can guarantee reasonable latency and recognition accuracy simultaneously. A better solution is to exploit edge computing; however, existing edge computing execution frameworks and offloading mechanisms for DNN inference suffer from unnecessary computational overhead and underutilize the computing capacity of end and edge devices. To address these shortcomings, this paper proposes an adaptive distributed DNN inference acceleration framework for the edge computing environment, which jointly optimizes the DNN computation path and the DNN computation partition. Evaluations demonstrate that our method effectively accelerates DNN inference compared with state-of-the-art methods.
Year
2019
DOI
10.1109/ICPADS47876.2019.00069
Venue
2019 IEEE 25th International Conference on Parallel and Distributed Systems (ICPADS)
Keywords
DNN inference, edge computing, multi path, partition, offloading
Field
Edge computing, Multi path, Computer science, Inference, Latency (engineering), Internet of Things, Exploit, Acceleration, Distributed computing, Computation
DocType
Conference
ISSN
1521-9097
ISBN
978-1-7281-2584-8
Citations
3
PageRank
0.37
References
3
Authors
4
Name            Order  Citations  PageRank
Huitian Wang    1      3          0.37
Guangxing Cai   2      3          0.37
Zhaowu Huang    3      3          0.37
Fang Dong       4      202        35.44