Title
A Sample-Efficient OPF Learning Method Based on Annealing Knowledge Distillation
Abstract
To respond quickly to variations in network load demand, data-driven techniques for predicting optimal power flow (OPF) have emerged in recent years. However, most existing methods depend heavily on large volumes of data, which limits their application to newly established or expanded systems. This work therefore proposes a sample-efficient OPF learning method that maximizes the utilization of limited samples. Decomposing the OPF task before knowledge distillation reduces the complexity of deep learning; knowledge distillation then reintegrates the decoupled tasks and improves accuracy in low-data settings. Unsupervised pre-training is introduced to reduce the demand for labeled data, and a focal loss function together with a teacher annealing strategy achieves higher accuracy without additional samples. Numerical tests on different systems confirm improved accuracy and training speed over other training methods, especially when samples are scarce.
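For readers unfamiliar with the teacher annealing idea mentioned in the abstract, the sketch below illustrates it in PyTorch: the student's training target is gradually annealed from the teacher's prediction toward the ground-truth OPF label, so the student leans on the teacher early in training and on real labels late. This is a minimal illustration only; the network sizes, the linear annealing schedule, and the MSE loss are assumptions made for the example, and the paper's full method (OPF task decomposition, stacked-denoising-autoencoder pre-training, and the focal loss) is not reproduced here.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: 32 load features in, 8 generator set-points out.
IN_DIM, OUT_DIM, TOTAL_STEPS = 32, 8, 1000

# Illustrative architectures; the paper's networks differ.
teacher = nn.Sequential(nn.Linear(IN_DIM, 64), nn.ReLU(), nn.Linear(64, OUT_DIM))
student = nn.Sequential(nn.Linear(IN_DIM, 32), nn.ReLU(), nn.Linear(32, OUT_DIM))
teacher.eval()  # teacher assumed pre-trained; frozen during distillation

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
mse = nn.MSELoss()

def annealed_target(y_true, y_teacher, step):
    # Teacher annealing: lam grows 0 -> 1, shifting the regression target
    # from the teacher's prediction toward the ground-truth label.
    lam = min(step / TOTAL_STEPS, 1.0)
    return lam * y_true + (1.0 - lam) * y_teacher

for step in range(TOTAL_STEPS):
    x = torch.randn(64, IN_DIM)        # placeholder load-demand samples
    y_true = torch.randn(64, OUT_DIM)  # placeholder OPF solution labels
    with torch.no_grad():
        y_teacher = teacher(x)         # soft targets from the frozen teacher
    loss = mse(student(x), annealed_target(y_true, y_teacher, step))
    opt.zero_grad()
    loss.backward()
    opt.step()
```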
Year
2022
DOI
10.1109/ACCESS.2022.3207146
Venue
IEEE ACCESS
Keywords
Encoding, Knowledge engineering, Load modeling, Deep learning, Data models, Power generation, Annealing, Noise reduction, Optimal control, Optimal power flow, sample efficiency, annealing knowledge distillation, focal loss function, stacked denoising autoencoder, deep learning
DocType
Journal
Volume
10
ISSN
2169-3536
Citations
0
PageRank
0.34
References
0
Authors
6
Name          Order  Citations  PageRank
Ziheng Dong   1      0          0.68
Kai Hou       2      1          2.05
Zeyu Liu      3      0          0.68
Xiaodan Yu    4      1          2.39
Hong Jie Jia  5      14         12.35
Chi Zhang     6      192        40.36