Publications

CASA: cost-effective EV charging scheduling based on deep reinforcement learning

Zhang, Ao; Liu, Qingzhi; Liu, Jinwei; Cheng, Long

Summary

With the widespread adoption of electric vehicles (EVs), the demand for public charging services is steadily increasing. Consequently, developing effective charging scheduling strategies that optimize the utilization of limited charging infrastructure has become a key challenge. Considering the diversity of user demands, we propose a Cost-Aware Charging Scheduling Architecture (CASA). This architecture serves both urgent and non-urgent charging customers by offering two charging modes with different power levels and associated costs. However, optimizing multiple objectives simultaneously while safeguarding the interests of all parties involved in the charging demand response is challenging. Moreover, the uncertainty in customer charging demands and Time-of-Use (TOU) tariffs further complicates model construction. To address these challenges, this study formulates EV charging scheduling as a Markov Decision Process (MDP) and solves it with deep reinforcement learning (DRL), specifically the Deep Q-Network (DQN) algorithm. The objective is to minimize the operational costs of charging stations while satisfying customers' quality of service (QoS) requirements. Simulation results demonstrate that CASA outperforms commonly used charging scheduling baselines on both average response time and service success rate. Furthermore, the CASA approach achieves a significant reduction in the operating costs of the EV charging station.
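The MDP-plus-DQN formulation described in the abstract can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the paper's actual model: the toy environment (a single charger whose state encodes pending-demand urgency and the current TOU price level), the three actions (idle, slow charge, fast charge, mirroring the two power levels plus doing nothing), the reward (served demand minus energy cost), and the tiny hand-rolled Q-network with experience replay.

```python
import random
import numpy as np

random.seed(0)

class ToyChargingEnv:
    """Hypothetical one-charger environment; dynamics are illustrative only."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)

    def reset(self):
        self.urgency = self.rng.random()           # normalized pending demand
        self.price = self.rng.choice([0.2, 0.8])   # off-peak / peak TOU tariff
        return np.array([self.urgency, self.price])

    def step(self, action):
        power = [0.0, 0.5, 1.0][action]            # idle / slow / fast mode
        # reward: demand served minus energy cost at the current tariff
        reward = power * self.urgency - power * self.price
        self.urgency = max(0.0, self.urgency - power) + 0.3 * self.rng.random()
        self.price = self.rng.choice([0.2, 0.8])
        return np.array([self.urgency, self.price]), reward

class TinyDQN:
    """One-hidden-layer Q-network trained on TD targets with replay."""
    def __init__(self, n_in=2, n_hidden=16, n_actions=3,
                 lr=0.05, gamma=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0, 0.5, (n_hidden, n_actions))
        self.b2 = np.zeros(n_actions)
        self.lr, self.gamma = lr, gamma
        self.replay = []

    def q(self, s):
        h = np.tanh(s @ self.w1 + self.b1)
        return h, h @ self.w2 + self.b2

    def act(self, s, eps):
        if random.random() < eps:                  # epsilon-greedy exploration
            return random.randrange(3)
        return int(np.argmax(self.q(s)[1]))

    def train_step(self, batch):
        for s, a, r, s2 in batch:
            h, qv = self.q(s)
            target = r + self.gamma * np.max(self.q(s2)[1])
            err = float(np.clip(qv[a] - target, -1.0, 1.0))  # clipped TD error
            # backprop through the chosen action's Q-value only
            grad_out = np.zeros(3)
            grad_out[a] = err
            self.w2 -= self.lr * np.outer(h, grad_out)
            self.b2 -= self.lr * grad_out
            dh = (self.w2 @ grad_out) * (1 - h * h)
            self.w1 -= self.lr * np.outer(s, dh)
            self.b1 -= self.lr * dh

env = ToyChargingEnv()
agent = TinyDQN()
s = env.reset()
for t in range(2000):
    a = agent.act(s, eps=max(0.05, 1.0 - t / 1000))  # decaying exploration
    s2, r = env.step(a)
    agent.replay.append((s, a, r, s2))
    if len(agent.replay) >= 32:
        agent.train_step(random.sample(agent.replay, 16))
    s = s2

# Q-values for a high-urgency, off-peak state after training
q_cheap_urgent = agent.q(np.array([1.0, 0.2]))[1]
```

The sketch omits standard DQN refinements such as a separate target network; it is meant only to show how a cost-aware charging decision can be cast as an MDP whose Q-function a small network approximates.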