Title
Joint Physical-Layer and System-Level Power Management for Delay-Sensitive Wireless Communications
Abstract
We consider the problem of energy-efficient point-to-point transmission of delay-sensitive data (e.g., multimedia data) over a fading channel. Existing research on this topic utilizes either physical-layer centric solutions, namely power-control and adaptive modulation and coding (AMC), or system-level solutions based on dynamic power management (DPM); however, there is currently no rigorous and unified framework for simultaneously utilizing both physical-layer centric and system-level techniques to achieve the minimum possible energy consumption, under delay constraints, in the presence of stochastic and a priori unknown traffic and channel conditions. In this paper, we propose such a framework. We formulate the stochastic optimization problem as a Markov decision process (MDP) and solve it online using reinforcement learning (RL). The advantages of the proposed online method are that 1) it does not require a priori knowledge of the traffic arrival and channel statistics to determine the jointly optimal power-control, AMC, and DPM policies; 2) it exploits partial information about the system so that less information needs to be learned than when using conventional reinforcement learning algorithms; and 3) it obviates the need for action exploration, which severely limits the adaptation speed and runtime performance of conventional reinforcement learning algorithms. Our results show that the proposed learning algorithms can converge up to two orders of magnitude faster than a state-of-the-art learning algorithm for physical-layer power-control and up to three orders of magnitude faster than conventional reinforcement learning algorithms.
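The abstract does not give the algorithmic details, so the following is only a rough, hypothetical sketch of the kind of online learning it describes, not the authors' algorithm: a tabular Q-learning loop that jointly selects a DPM mode and a transmit power/AMC level to trade transmission and circuit energy against buffer-induced delay, learning online without prior knowledge of the traffic or channel statistics. All state spaces, cost weights, and dynamics below are illustrative assumptions, and this generic sketch uses epsilon-greedy exploration, whereas the paper's method exploits known system structure to avoid exploration.

```python
# Hypothetical sketch (not the paper's algorithm): online Q-learning over a joint
# DPM / power-control / AMC action space, minimizing energy plus a delay penalty.
import random

QUEUE_MAX = 10                      # buffer occupancy levels (packets), assumed
CHANNEL_STATES = 4                  # quantized fading states with unknown dynamics
DPM_MODES = ("off", "on")           # system-level power-management modes
TX_LEVELS = (0, 1, 2, 3)            # joint power-control / AMC levels (0 = no transmission)

# No transmission is possible while the radio is powered off.
ACTIONS = [(m, r) for m in DPM_MODES for r in TX_LEVELS if not (m == "off" and r > 0)]

ALPHA, GAMMA, DELAY_WEIGHT = 0.1, 0.95, 0.5
Q = {}                              # Q[(queue, channel)][action] -> estimated discounted cost

def q_values(state):
    return Q.setdefault(state, {a: 0.0 for a in ACTIONS})

def cost(state, action):
    """Illustrative immediate cost: transmit + circuit energy plus a backlog
    penalty that stands in for the delay constraint."""
    queue, channel = state
    mode, level = action
    tx_energy = level * (CHANNEL_STATES - channel)   # worse channel -> more energy per packet
    circuit_energy = 1.0 if mode == "on" else 0.0
    return tx_energy + circuit_energy + DELAY_WEIGHT * queue

def step(state, action):
    """Toy environment: random arrivals and fading transitions stand in for the
    traffic and channel statistics that the learner does not know a priori."""
    queue, _ = state
    _, level = action
    served = min(queue, level)
    arrivals = random.choice((0, 1, 2))
    next_queue = min(QUEUE_MAX, queue - served + arrivals)
    next_channel = random.randrange(CHANNEL_STATES)
    return (next_queue, next_channel)

def act(state, epsilon=0.1):
    """Epsilon-greedy selection; the paper's method avoids explicit action
    exploration, which this generic sketch does not capture."""
    qs = q_values(state)
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return min(qs, key=qs.get)      # minimize expected discounted cost

def update(state, action, next_state):
    target = cost(state, action) + GAMMA * min(q_values(next_state).values())
    q_values(state)[action] += ALPHA * (target - q_values(state)[action])

state = (0, 0)
for _ in range(50_000):             # online learning loop
    action = act(state)
    next_state = step(state, action)
    update(state, action, next_state)
    state = next_state
```

In the paper's setting, known quantities such as the energy cost of each action and the buffer dynamics can be exploited directly rather than learned, which is what allows the proposed algorithms to dispense with exploration and converge far faster than the conventional learner sketched here.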
Year
2013
DOI
10.1109/TMC.2012.36
Venue
IEEE Transactions on Mobile Computing
Keywords
Markov processes, adaptive codes, adaptive modulation, optimisation, radio networks, stochastic processes, telecommunication network management, telecommunication traffic, DPM policies, Markov decision process, adaptive modulation and coding, channel conditions, channel statistics, delay-sensitive data, delay-sensitive wireless communications, dynamic power management, energy-efficient point-to-point transmission, fading channel, multimedia data, optimal power-control, physical-layer centric solutions, physical-layer power management, reinforcement learning, reinforcement learning algorithms, state-of-the-art learning algorithm, stochastic optimization problem, system-level power management, traffic arrival, unknown traffic, energy-efficient wireless communications, power-control
Field
Link adaptation, Stochastic optimization, Markov process, Computer science, Fading, Power control, Communication channel, Markov decision process, Computer network, Reinforcement learning, Distributed computing
DocType
Journal
Volume
12
Issue
4
ISSN
1536-1233
Citations
23
PageRank
0.84
References
12
Authors
2
Name                     Order  Citations  PageRank
Nicholas Mastronarde     1      240        26.93
Mihaela van der Schaar   2      23         0.84