Abstract |
---|
The return on investment of a battery system is maximized if the battery control strategy is appropriately matched to the operating environment (e.g., pricing scheme, electrical load). For residential battery systems, the current practice is to statically determine the control policy prior to system installation; the battery subsequently spends upwards of 10 years operating in a dynamic environment. A state-of-the-art model predictive controller (MPC) can adapt to changes in the system, but is limited by its high online computational requirements. To better extract value at a reasonable online computational cost, we propose an adaptive battery controller framework that learns a control strategy by encoding an MPC policy in a neural network, as data becomes available, to adapt the control to the operating environment. We evaluate our controller in the context of a solar PV-storage system deployed in Texas under a time-of-use pricing scheme. We find that our controller comes within 5-10% of optimal performance, and outperforms a default control strategy for PV-storage systems within a few months of installation. |
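The abstract's core idea is to replace repeated expensive online MPC solves with a cheap learned policy fit to the MPC's own decisions. The following is an illustrative sketch of that idea only, not the authors' implementation: the `mpc_action` function, the synthetic state distribution, and the use of a linear least-squares fit (standing in for the paper's neural network) are all assumptions made for this example.

```python
import numpy as np

# Hypothetical sketch (not the authors' code): imitate a model predictive
# controller by fitting a function from observed system state to the
# MPC's chosen battery action, so the expensive solver is only needed offline.

rng = np.random.default_rng(0)

# Synthetic operating environment: state = (solar generation kW, load kW, price $/kWh).
states = rng.uniform([0.0, 0.2, 0.05], [5.0, 3.0, 0.30], size=(500, 3))

def mpc_action(state):
    """Stand-in for an expensive MPC solve: charge on surplus solar,
    discharge more aggressively as the electricity price rises."""
    solar, load, price = state
    surplus = solar - load
    return surplus if surplus > 0 else surplus * (price / 0.30)

actions = np.array([mpc_action(s) for s in states])

# "Encode" the policy: fit a cheap model to (state, MPC action) pairs
# as data becomes available. A linear least-squares fit stands in for
# the neural network used in the paper.
X = np.hstack([states, np.ones((len(states), 1))])  # features plus bias term
weights, *_ = np.linalg.lstsq(X, actions, rcond=None)

def learned_policy(state):
    # Online evaluation is a single dot product, far cheaper than an MPC solve.
    return np.append(state, 1.0) @ weights

test_state = np.array([4.0, 1.0, 0.10])
print(mpc_action(test_state), learned_policy(test_state))
```

The learned policy only approximates the MPC, so (as the abstract reports for the real system) its performance is near-optimal rather than optimal, but its online cost is negligible.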
Year | DOI | Venue |
---|---|---|
2019 | 10.1145/3307772.3331032 | e-Energy '19: Proceedings of the 10th ACM International Conference on Future Energy Systems
Keywords | Field | DocType
---|---|---|
Deep neural networks, battery control, adaptive system, model predictive control | Computer science, Control engineering, Battery (electricity), Artificial neural network | Conference
Citations | PageRank | References
---|---|---|
3 | 0.48 | 0
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Fiodar Kazhamiaka | 1 | 4 | 1.19 |
Srinivasan Keshav | 2 | 3778 | 761.32 |
Catherine Rosenberg | 3 | 1877 | 137.46 |