Title
Allocating Resource Capacities for an Offload-enabled Mobile Edge Cloud System
Abstract
AI applications have become increasingly popular and are widely deployed across many scenarios. In a typical AI application, a model is first trained and then deployed as an inference service that processes incoming data. In this paper, we systematically model the performance of running such inference tasks on a Mobile Edge Cloud (MEC) system, in which the inference services are deployed on mobile devices, edge devices, and a cloud server. Mobile devices collect monitoring data and perform the inference tasks locally. When the arrival rate of incoming data becomes too high, a mobile device offloads a portion of its inference tasks by uploading the corresponding data to an edge device. If an edge device is in turn overwhelmed, it can further offload tasks to the cloud server. This paper aims to model the offloading behavior in the MEC system as well as the resource capacities required to meet the desired task performance.
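The offloading rule sketched in the abstract (offload only the excess load that the local device cannot absorb) can be illustrated with a minimal queueing-style calculation. This sketch is not from the paper; the function name, the M/M/1-style utilization cap, and the 0.8 threshold are all illustrative assumptions.

```python
# Illustrative sketch (NOT the paper's model): a device offloads just
# enough tasks to keep its local utilization below a target threshold,
# treating the device as a single-server queue.

def offload_fraction(arrival_rate: float, service_rate: float,
                     max_util: float = 0.8) -> float:
    """Fraction of incoming tasks to offload so that the local
    utilization (kept_arrival_rate / service_rate) stays <= max_util."""
    if service_rate <= 0:
        raise ValueError("service_rate must be positive")
    # Maximum arrival rate the device keeps local under the cap.
    local_capacity = max_util * service_rate
    if arrival_rate <= local_capacity:
        return 0.0  # the device can handle everything itself
    # Offload the remainder to the edge (or, from the edge, to the cloud).
    return 1.0 - local_capacity / arrival_rate

# Example: 12 tasks/s arrive, the device serves 10 tasks/s, cap at 0.8:
# it keeps 8 tasks/s local and offloads the remaining 4/12 = 1/3.
print(offload_fraction(12.0, 10.0))
```

The same rule applies recursively: an edge device overwhelmed by offloaded tasks computes its own excess fraction and forwards it to the cloud server.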
Year
2022
DOI
10.1109/BigDataService55688.2022.00010
Venue
2022 IEEE Eighth International Conference on Big Data Computing Service and Applications (BigDataService)
Keywords
mobile edge computing, AI applications, task performance, queuing theory
DocType
Conference
ISBN
978-1-6654-5891-7
Citations
0
PageRank
0.34
References
8
Authors
2
Name | Order | Citations | PageRank
Zhiyan Chen | 1 | 0 | 0.68
Ligang He | 2 | 5425 | 6.73