Title
Q-Learning-Based Supervisory Control Adaptability Investigation for Hybrid Electric Vehicles
Abstract
Although Q-learning-based supervisory control is a form of adaptive optimal control, its adaptability in hybrid electric vehicle (HEV) energy management has rarely been studied. In real-world driving, conditions such as vehicle load, road condition, and traffic may vary. If the vehicle supervisory control does not adapt to these changes, the resulting fuel economy may be suboptimal. To the best of our knowledge, this study is the first to investigate the adaptability of Q-learning-based supervisory control for HEVs. A comprehensive analysis interprets adaptability with respect to three varying factors: driving cycle, vehicle load condition, and road grade. A parallel HEV architecture is considered, and Q-learning is used as the reinforcement learning algorithm to control the torque split between the engine and the electric motor. Model predictive control, the equivalent consumption minimization strategy, and a thermostatic control strategy are implemented for comparison. The Q-learning-based supervisory control shows strong adaptability under different conditions and achieves the best fuel economy of the four supervisory controls under all three varying conditions.
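To illustrate the kind of update the abstract refers to, the following is a minimal sketch of tabular Q-learning applied to a discretized engine/motor torque-split decision. The state discretization (battery SOC and torque demand bins), the toy reward (an assumed fuel-use term plus an SOC deviation penalty), the placeholder environment, and all hyperparameters are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of tabular Q-learning for an HEV torque split.
# The state/action discretization, reward shaping, environment, and
# hyperparameters are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

N_SOC, N_DEMAND = 10, 10             # discretized battery SOC and driver torque demand
ACTIONS = np.linspace(0.0, 1.0, 5)   # engine share of the demanded torque
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1   # learning rate, discount factor, exploration rate

Q = np.zeros((N_SOC, N_DEMAND, len(ACTIONS)))

def choose_action(state):
    """Epsilon-greedy selection over the engine torque fraction."""
    if rng.random() < EPS:
        return int(rng.integers(len(ACTIONS)))
    return int(np.argmax(Q[state]))

def q_update(state, action, reward, next_state):
    """Standard one-step Q-learning update."""
    td_target = reward + GAMMA * np.max(Q[next_state])
    Q[state + (action,)] += ALPHA * (td_target - Q[state + (action,)])

def fake_step(state, action):
    """Placeholder environment: random next state and a toy reward that
    penalizes assumed fuel use (engine share) and SOC deviation from mid-range."""
    soc, demand = state
    engine_share = ACTIONS[action]
    fuel_penalty = engine_share * demand / N_DEMAND
    soc_penalty = abs(soc / (N_SOC - 1) - 0.5)
    reward = -(fuel_penalty + 0.5 * soc_penalty)
    next_state = (int(rng.integers(N_SOC)), int(rng.integers(N_DEMAND)))
    return reward, next_state

state = (N_SOC // 2, 0)
for _ in range(10_000):
    action = choose_action(state)
    reward, next_state = fake_step(state, action)
    q_update(state, action, reward, next_state)
    state = next_state

print("Greedy engine share at mid SOC, high demand:",
      ACTIONS[int(np.argmax(Q[N_SOC // 2, N_DEMAND - 1]))])
```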
Year
2022
DOI
10.1109/TITS.2021.3062179
Venue
IEEE Transactions on Intelligent Transportation Systems
Keywords
Reinforcement learning, Q-learning, supervisory control, hybrid electric vehicle, real-time implementation
DocType
Journal
Volume
23
Issue
7
ISSN
1524-9050
Citations
0
PageRank
0.34
References
5
Authors
7
Name              Order  Citations  PageRank
Bin Xu            1      0          0.34
Xiaolin Tang      2      0          0.34
Xiaosong Hu       3      0          0.34
Xianke Lin        4      3          1.07
Huayi Li          5      0          1.01
Dhruvang Rathod   6      0          0.34
Zhe Wang          7      0          0.34