Title
Markov Decision Process-Based Resilience Enhancement for Distribution Systems: An Approximate Dynamic Programming Approach
Abstract
Because failures in distribution systems caused by extreme weather events directly result in consumer outages, this paper proposes a state-based decision-making model that mitigates loss of load to improve distribution system resilience throughout an unfolding event. System topologies, including the on/off states of feeder lines, are modeled as Markov states, and the transition probabilities between Markov states are determined by the component failures caused by the unfolding event. A recursive optimization model based on Markov decision processes (MDP) is developed to take state-based actions, i.e., system reconfiguration, at each decision time. To overcome the curse of dimensionality caused by the enormous numbers of states and actions, an approximate dynamic programming (ADP) approach based on post-decision states and iteration is used to solve the proposed MDP-based model. The IEEE 33-bus and IEEE 123-bus systems are used to validate the proposed model.
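The abstract describes an MDP whose states are the on/off topologies of feeder lines and an ADP solution built around post-decision states. Below is a minimal, illustrative sketch of that post-decision-state idea on a toy three-line feeder; it is not the paper's formulation. The line loads, failure probabilities, horizon, the "restore at most one line per epoch" action set, and the lookup-table value approximation are all assumptions introduced here for illustration.

# Illustrative sketch only: a toy post-decision-state ADP forward pass for an
# MDP whose Markov state is the on/off status of feeder lines, in the spirit
# of the abstract. All data and the action model are hypothetical.
import random

LINE_LOAD = [4.0, 2.5, 1.5]   # MW served by each feeder line when in service (assumed)
P_FAIL = [0.15, 0.10, 0.20]   # per-epoch weather-driven failure probability (assumed)
HORIZON = 6                   # number of decision epochs during the event
N_ITER = 2000                 # ADP forward-pass iterations
ALPHA = 0.05                  # stepsize for value updates

def served_load(status):
    """Per-epoch reward: total load supplied given line on/off status."""
    return sum(l for l, s in zip(LINE_LOAD, status) if s)

def actions(status):
    """Restore (e.g., reconfigure around) at most one failed line per epoch."""
    return [None] + [i for i, s in enumerate(status) if not s]

def post_decision(status, act):
    """Deterministic effect of the action: the chosen line is back in service."""
    status = list(status)
    if act is not None:
        status[act] = 1
    return tuple(status)

def sample_failures(status):
    """Exogenous weather transition: each in-service line may fail."""
    return tuple(0 if (s and random.random() < p) else s
                 for s, p in zip(status, P_FAIL))

# Lookup-table approximation of the value-to-go around the post-decision state.
V = {}  # key: (epoch, post_decision_status) -> estimated value-to-go

def greedy(t, status):
    """Pick the action maximizing immediate reward + approximate value-to-go."""
    best_a, best_v = None, float("-inf")
    for a in actions(status):
        pd = post_decision(status, a)
        v = served_load(pd) + V.get((t, pd), 0.0)
        if v > best_v:
            best_a, best_v = a, v
    return best_a, best_v

random.seed(0)
for n in range(N_ITER):
    status = (1, 1, 1)      # all lines in service at the start of the event
    prev_key = None         # post-decision state visited at the previous epoch
    for t in range(HORIZON):
        a, v_hat = greedy(t, status)        # pure exploitation, for simplicity
        # Nudge the value of the previous post-decision state toward v_hat.
        if prev_key is not None:
            V[prev_key] = (1 - ALPHA) * V.get(prev_key, 0.0) + ALPHA * v_hat
        pd = post_decision(status, a)
        prev_key = (t, pd)
        status = sample_failures(pd)        # exogenous information arrives

# Using the learned approximation to pick a reconfiguration action:
print("Action if lines 0 and 2 are down at t=0:", greedy(0, (0, 1, 0))[0])

The forward pass follows the standard single-pass update around post-decision states: the value of the previous epoch's post-decision state is nudged toward the newly observed sample estimate, so the weather-driven transition probabilities never need to be enumerated explicitly, which is what makes the approach tractable when the number of topologies is large.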
Year
2020
DOI
10.1109/TSG.2019.2956740
Venue
IEEE Transactions on Smart Grid
Keywords
Markov processes, Load modeling, Meteorology, Topology, Power systems, Resilience, Fault tolerance
DocType
Journal
Volume
11
Issue
3
ISSN
1949-3053
Citations
0
PageRank
0.34
References
0
Authors
6
Name (by author order)
1. Wang Chong
2. Ping Ju
3. Shunbo Lei
4. Zhaoyu Wang
5. Feng Wu
6. Yunhe Hou