Title
Optimization of lightweight task offloading strategy for mobile edge computing based on deep reinforcement learning
Abstract
With the maturity of 5G technology and the proliferation of intelligent terminal devices, the traditional cloud computing service model can no longer cope with the explosive growth of business data. Mobile edge computing (MEC) is therefore intended to alleviate problems such as latency and network load. This paper first applies deep reinforcement learning (DRL) to the offloading problem in large-scale heterogeneous MEC, where a cluster contains multiple service nodes and mobile tasks have multiple dependencies. It then improves the DQN algorithm with an LSTM network layer and a candidate network set, adapted to the actual MEC environment. Finally, the task offloading problem is simulated using iFogSim and Google Cluster Trace. The simulation results show that the offloading strategy based on the proposed IDRQN algorithm outperforms other algorithms in energy consumption, load balancing, latency and average execution time.
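The candidate-network idea mentioned in the abstract, restricting action selection to a subset of eligible edge servers rather than the full action space, can be illustrated with a minimal sketch. This is an assumption about the general technique, not the paper's implementation; the function name, Q-value representation, and epsilon-greedy policy here are hypothetical.

```python
import random

def select_offload_action(q_values, candidate_set, epsilon=0.1, rng=None):
    """Epsilon-greedy offloading decision restricted to a candidate set.

    q_values      -- estimated Q-value per edge server (one entry per server)
    candidate_set -- indices of servers currently eligible for offloading
    epsilon       -- exploration probability
    rng           -- optional random.Random instance for reproducibility
    """
    rng = rng or random.Random()
    if rng.random() < epsilon:
        # Explore: pick uniformly, but only among candidate servers.
        return rng.choice(candidate_set)
    # Exploit: pick the candidate server with the highest Q-value.
    return max(candidate_set, key=lambda i: q_values[i])
```

For example, with Q-values `[0.1, 0.9, 0.5, 0.3]` and candidates `[0, 2, 3]`, a greedy call (`epsilon=0.0`) selects server 2 even though server 1 has the highest overall Q-value, since server 1 is outside the candidate set. Pruning the action space this way is what makes such a strategy lightweight relative to evaluating every server.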
Year
2020
DOI
10.1016/j.future.2019.07.019
Venue
Future Generation Computer Systems
Keywords
Mobile edge computing, Task offloading, Deep reinforcement learning, LSTM network, Candidate network
Field
Computer science, Load balancing (computing), Latency (engineering), Network layer, Mobile edge computing, Energy consumption, Business data, Distributed computing, Cloud computing, Reinforcement learning
DocType
Journal
Volume
102
ISSN
0167-739X
Citations
7
PageRank
0.57
References
0
Authors
5
Name | Order | Citations | PageRank
Haifeng Lu | 1 | 8 | 1.26
Chunhua Gu | 2 | 9 | 2.69
Fei Luo | 3 | 13 | 2.42
Weichao Ding | 4 | 8 | 2.95
Xinping Liu | 5 | 7 | 0.57