Title
Demand Response Management for Industrial Facilities: A Deep Reinforcement Learning Approach
Abstract
As a major consumer of energy, the industrial sector must assume responsibility for improving energy efficiency and reducing carbon emissions. However, most existing studies on industrial energy management struggle to model complex industrial processes. To address this issue, we developed a model-free demand response (DR) scheme for industrial facilities. Specifically, we first formulated the Markov decision process (MDP) for industrial DR, detailing the composition of the state, action, and reward function. We then designed an actor-critic-based deep reinforcement learning algorithm to determine the optimal energy management policy, in which both the actor (policy) and the critic (value function) are implemented as deep neural networks. Finally, we validated the scheme by applying it to a real-world industrial facility. The algorithm identified an optimal energy consumption schedule, reducing energy costs without compromising production.
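The abstract's actor-critic idea can be illustrated with a minimal sketch. The toy single-step DR task below (the tariff levels, load size, and delay penalty) is entirely an assumption for illustration, and a tabular softmax actor with a tabular critic stands in for the paper's deep neural networks:

```python
import numpy as np

# Hypothetical toy DR task: each production task sees one of three tariff
# states and either runs now or defers to the off-peak slot for a penalty.
rng = np.random.default_rng(0)

PRICES = np.array([0.10, 0.40, 0.60])  # off-peak / mid / peak tariff ($/kWh), assumed
LOAD_KWH = 100.0                       # energy one production task consumes, assumed
DELAY_PENALTY = 20.0                   # cost of deferring production, assumed

def step(state, action):
    """Reward = negative energy cost; deferring pays off-peak price plus a penalty."""
    if action == 0:                                   # run in the current price slot
        return -PRICES[state] * LOAD_KWH
    return -PRICES[0] * LOAD_KWH - DELAY_PENALTY      # defer to off-peak

n_states, n_actions = len(PRICES), 2
theta = np.zeros((n_states, n_actions))  # actor: softmax policy parameters
v = np.zeros(n_states)                   # critic: state-value estimates

def policy(state):
    logits = theta[state] - theta[state].max()  # numerically stabilized softmax
    p = np.exp(logits)
    return p / p.sum()

alpha_actor, alpha_critic = 0.05, 0.2
for _ in range(5000):
    s = int(rng.integers(n_states))          # observe a random price state
    pi = policy(s)
    a = int(rng.choice(n_actions, p=pi))     # sample an action from the actor
    r = step(s, a)
    advantage = r - v[s]                     # critic's one-step advantage estimate
    v[s] += alpha_critic * advantage         # critic update (running value estimate)
    grad = -pi                               # d log pi(a|s) / d theta[s]
    grad[a] += 1.0
    theta[s] += alpha_actor * advantage * grad  # actor (policy-gradient) update

# Learned schedule per price state: 0 = run now, 1 = defer to off-peak.
schedule = [int(np.argmax(policy(s))) for s in range(n_states)]
print(schedule)
```

With these assumed numbers, running immediately is cheapest only in the off-peak state, so the learned schedule defers the load whenever the tariff is high; the critic's value estimate serves as the baseline that keeps the policy-gradient update low-variance.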
Year
2019
DOI
10.1109/ACCESS.2019.2924030
Venue
IEEE ACCESS
Keywords
Artificial intelligence, deep reinforcement learning, demand response (DR), industrial facilities, actor-critic
Field
Load management, Energy management, Computer science, Efficient energy use, Operations research, Markov decision process, Demand response, Artificial neural network, Energy consumption, Reinforcement learning, Distributed computing
DocType
Journal
Volume
7
ISSN
2169-3536
Citations
0
PageRank
0.34
References
0
Authors
5
Name           Order  Citations  PageRank
Xuefei Huang   1      1          0.68
Seung Ho Hong  2      128        30.23
Mengmeng Yu    3      12         2.69
Yuemin Ding    4      9          10.43
Junhui Jiang   5      2          1.45