Title
TreeNet: A Hierarchical Deep Learning Model to Facilitate Edge Intelligence for Resource-Constrained Devices
Abstract
Deep learning has achieved remarkable success in areas such as computer vision and natural language processing. Many sophisticated models improve performance by stacking large numbers of neural layers. Edge intelligence, an emerging research area, brings intelligence to the network edge by integrating edge computing with AI technologies, and it has gained wide attention for its lower latency and better privacy preservation. Nevertheless, training and inference with deep neural networks require intensive computation and time, making it challenging to run such models on resource-constrained edge devices. In this paper, we propose TreeNet, a deep learning model based on task decomposition. Instead of fitting an entire task at once, we decompose it into disjoint sub-tasks to reduce the complexity of the required deep learning model (the decomposition can be applied recursively if necessary). We first fit the mapping from the original dataset to the different sub-tasks, and then fit, for each sub-task, the mapping to the original categories it contains. At inference time, we dynamically invoke a low-level classifier based on the result of the high-level classifier; when the high-level result is unreliable, we send the input sample to a cloud server for processing. Experiments on several popular datasets show that TreeNet can process most of the input data locally while achieving high inference accuracy and significantly reducing the total amount of computation.
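The inference procedure described in the abstract (high-level classifier selects a sub-task, a low-level classifier resolves the final category, and low-confidence inputs are offloaded to the cloud) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the classifier interfaces, the `send_to_cloud` callback, and the confidence `threshold` are all assumptions made for the example.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def treenet_infer(x, high_level, sub_classifiers, send_to_cloud, threshold=0.8):
    """TreeNet-style hierarchical inference (illustrative sketch).

    high_level(x)        -> raw scores over sub-task groups
    sub_classifiers[g](x)-> raw scores over the categories inside group g
    send_to_cloud(x)     -> fallback when the high-level result is unreliable
    """
    probs = softmax(high_level(x))
    group = max(range(len(probs)), key=probs.__getitem__)
    if probs[group] < threshold:
        # High-level prediction is unreliable: offload the sample to the cloud.
        return send_to_cloud(x)
    # Dynamically invoke only the low-level classifier for the chosen group.
    sub_probs = softmax(sub_classifiers[group](x))
    label = max(range(len(sub_probs)), key=sub_probs.__getitem__)
    return (group, label)
```

Because only one low-level classifier runs per input, the edge device avoids evaluating a single large model over all categories; only uncertain samples pay the latency cost of the cloud round-trip.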
Year
2021
DOI
10.1109/CCGrid51090.2021.00062
Venue
2021 IEEE/ACM 21st International Symposium on Cluster, Cloud and Internet Computing (CCGrid)
Keywords
edge computing, deep learning, resource-constrained, model compression, model acceleration, edge intelligence
DocType
Conference
ISBN
978-1-7281-9587-2
Citations
0
PageRank
0.34
References
0
Authors
4
Name | Order | Citations | PageRank
Dong Lu | 1 | 0 | 0.34
Yanlong Zhai | 2 | 92 | 6.38
Jianqing Wu | 3 | 0 | 0.34
Jun Shen | 4 | 20 | 8.82