Title
Cooperative reinforcement learning based throughput optimization in energy harvesting wireless sensor networks
Abstract
Energy Harvesting Wireless Sensor Networks (EH-WSNs) have received increasing attention in recent years. In practical deployments, the energy that can be harvested from the environment is continuously changing and unpredictable. This paper investigates energy management for EH-WSNs under such conditions and proposes a dynamic scheme to optimize network throughput. We adopt a Cooperative Reinforcement Learning (CRL) method: first, we model the state of the external environment; then a CRL algorithm based on Q-learning regulates each EH-node's duty cycle according to variations in the external energy supply, while a feedback reward evaluates the quality of the CRL's regulation. Unlike traditional reinforcement learning, CRL lets EH-nodes periodically share their local knowledge with one another. With this information, an EH-node chooses one of four actions for the current time slot: (I) idling, (II) sensing, (III) calculating, or (IV) transmitting. Experimental results show that the proposed scheme keeps EH-nodes energy-balanced, satisfies the network throughput requirement effectively, and clearly improves energy utilization efficiency compared with an existing strategy.
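The abstract outlines a Q-learning loop over discretized energy states with four per-slot actions, a feedback reward, and periodic knowledge sharing between nodes. A minimal sketch of such an agent is given below; the state discretization, parameter values, and the averaging form of the cooperative merge step are all illustrative assumptions, not details taken from the paper.

```python
import random

# Hypothetical sketch of the Q-learning core described in the abstract.
# State = discretized harvested-energy level; actions are the paper's
# four per-slot choices. All numeric parameters are assumptions.
ACTIONS = ["idle", "sense", "calculate", "transmit"]

class EHNodeAgent:
    def __init__(self, n_energy_levels=5, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        # Q-table: one row per discretized energy state, one column per action.
        self.q = [[0.0] * len(ACTIONS) for _ in range(n_energy_levels)]

    def choose_action(self, state):
        # Epsilon-greedy selection of the action for the current time slot.
        if random.random() < self.epsilon:
            return random.randrange(len(ACTIONS))
        row = self.q[state]
        return row.index(max(row))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning update; the reward would encode the
        # throughput / energy-balance feedback mentioned in the abstract.
        best_next = max(self.q[next_state])
        self.q[state][action] += self.alpha * (
            reward + self.gamma * best_next - self.q[state][action])

    def merge_shared_q(self, neighbor_q, weight=0.5):
        # Cooperative step (assumed form): blend a neighbor's shared
        # Q-table into the local one during periodic knowledge exchange.
        for s, row in enumerate(neighbor_q):
            for a, v in enumerate(row):
                self.q[s][a] = (1 - weight) * self.q[s][a] + weight * v
```

In each slot the node would call `choose_action` on its current energy level, execute the chosen action, observe the reward and the next energy level, and call `update`; `merge_shared_q` would run at each knowledge-sharing period.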
Year
2018
DOI
10.1109/WOCC.2018.8372691
Venue
2018 27th Wireless and Optical Communication Conference (WOCC)
Keywords
energy harvesting, wireless sensor network, energy management, cooperative reinforcement learning, energy neutral
Field
Data collection, Energy management, Software deployment, Duty cycle, Computer science, Computer network, Throughput, Wireless sensor network, Energy consumption, Reinforcement learning
DocType
Conference
ISSN
2379-1268
ISBN
978-1-5386-4960-2
Citations
0
PageRank
0.34
References
4
Authors
2
Name, Order, Citations, PageRank
Yin Wu, 1, 38, 4.79
Kun Yang, 2, 471, 2.60