Title
Optimal Accuracy-Time Trade-off for Deep Learning Services in Edge Computing Systems
Abstract
With the increasing demand for computationally intensive services such as deep learning tasks, emerging distributed computing platforms like edge computing (EC) systems are becoming more popular. Edge computing systems have shown promising latency reductions compared to traditional cloud systems. However, their limited processing capacity imposes a trade-off between the potential latency reduction and the achievable accuracy of computationally intensive services such as deep learning-based services. In this paper, we focus on finding the optimal accuracy-time trade-off for running deep learning services in a three-tier EC platform where several deep learning models with different accuracy levels are available. Specifically, we cast the problem as an Integer Linear Program, where optimal task scheduling decisions are made to maximize overall user satisfaction with respect to the accuracy-time trade-off. We prove that our problem is NP-hard and then provide a polynomial-time greedy algorithm, called GUS, that is shown to attain near-optimal results. Finally, after vetting our algorithmic solution through numerical experiments and comparison with a set of heuristics, we deploy it on a test-bed implementation to measure real-world performance. The results of both the numerical analysis and the real-world implementation show that GUS outperforms the baseline heuristics in terms of the average percentage of satisfied users by at least 50%.
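The abstract describes a greedy scheduler (GUS) that assigns deep learning tasks to servers and model variants so as to maximize the number of satisfied users under capacity constraints. The paper's exact scoring and constraints are not given here, so the sketch below is only an illustrative greedy in that spirit: the satisfaction score, the field names (`deadline`, `load`, `accuracy`), and the single-capacity server model are all assumptions, not the authors' formulation.

```python
# Hypothetical greedy scheduler in the spirit of GUS (illustrative only:
# the scoring function and data model are assumptions, not from the paper).

def greedy_schedule(tasks, servers, models):
    """Assign each task a (server, model) pair that fits the server's
    remaining capacity, preferring assignments with the highest
    satisfaction score; tasks left unassigned count as unsatisfied."""
    capacity = dict(servers)  # server name -> remaining compute units
    # Enumerate every (task, server, model) combination with a toy
    # satisfaction score rewarding accuracy and penalizing relative delay.
    candidates = []
    for t in tasks:
        for s in capacity:
            for m in models:
                score = m["accuracy"] - 0.5 * m["time"] / t["deadline"]
                candidates.append(
                    (score, t["id"], s, m["name"], m["load"], m["time"])
                )
    candidates.sort(reverse=True)  # best satisfaction score first
    assigned, schedule = set(), {}
    for score, tid, s, mname, cost, delay in candidates:
        if tid in assigned or capacity[s] < cost:
            continue
        task = next(t for t in tasks if t["id"] == tid)
        if delay > task["deadline"]:
            continue  # would violate the user's latency requirement
        capacity[s] -= cost
        assigned.add(tid)
        schedule[tid] = (s, mname)
    return schedule


tasks = [{"id": 1, "deadline": 1.0}, {"id": 2, "deadline": 0.5}]
servers = [("edge", 2)]
models = [
    {"name": "big", "accuracy": 0.95, "time": 0.8, "load": 2},
    {"name": "small", "accuracy": 0.80, "time": 0.3, "load": 1},
]
print(greedy_schedule(tasks, servers, models))
# → {1: ('edge', 'small'), 2: ('edge', 'small')}
```

Here the tight deadline of task 2 rules out the large model, and the capacity of two units means the small model serves both users, satisfying everyone at a lower accuracy; the actual GUS algorithm makes this accuracy-time trade-off according to the paper's ILP objective.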
Year: 2021
DOI: 10.1109/ICC42927.2021.9500744
Venue: IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2021)
Keywords: Mobile edge computing, task offloading, resource management, deep learning, raspberry pi, user satisfaction, quality of experience
DocType: Conference
ISSN: 1550-3607
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name                Order  Citations  PageRank
Minoo Hosseinzadeh  1      2          2.09
Andrew Wachal       2      0          0.34
Hana Khamfroush     3      75         11.84
Daniel E. Lucani    4      2364       2.29