Reinforcement Learning-Based Plug-in Electric Vehicle Charging with Forecasted Price
Research output: Contribution to journal › Article › Scientific › peer-review
This paper proposes a novel demand response method that aims to reduce the long-term cost of charging the battery of an individual plug-in electric vehicle (PEV). The problem is cast as a daily decision-making problem: choosing the amount of energy to charge into the PEV battery within a day. We model the problem as a Markov decision process with unknown transition probabilities. A batch reinforcement learning algorithm is proposed for learning an optimal cost-reducing charging policy from a batch of transition samples and for making cost-reducing charging decisions in new situations. To capture day-to-day differences in electricity charging costs, the method uses actual electricity prices for the current day and predicted electricity prices for the following day. A Bayesian neural network is employed to predict the electricity prices. The reinforcement learning training data set is constructed from historical prices, and a linear-programming-based method is developed to create a data set of optimal charging decisions. Different charging scenarios are simulated for each day of the historical time frame using the set of past electricity prices. Simulation results using real-world pricing data demonstrate cost savings of 10%-50% for the PEV owner when using the proposed charging method.
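As a rough illustration of the optimal-charging computation the abstract attributes to a linear program, consider minimizing total cost over a horizon of known (or predicted) hourly prices, subject to a total-energy requirement and a per-hour charging-rate limit. This is a minimal sketch under assumed constraints, not the paper's actual formulation; all names, prices, and parameters below are illustrative. With only a total-energy equality constraint and per-hour rate bounds, the LP's optimum is attained by filling the cheapest hours first, so a greedy fill reproduces it without an LP solver.

```python
def optimal_schedule(prices, required_kwh, max_rate_kwh):
    """Return per-hour charge amounts (kWh) that minimize total cost.

    Equivalent to the LP: minimize sum(prices[h] * x[h])
    s.t. sum(x) == required_kwh and 0 <= x[h] <= max_rate_kwh,
    solved here by filling the cheapest hours first.
    """
    schedule = [0.0] * len(prices)
    remaining = required_kwh
    # Visit hours from cheapest to most expensive.
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if remaining <= 0:
            break
        amount = min(max_rate_kwh, remaining)
        schedule[hour] = amount
        remaining -= amount
    return schedule

# Illustrative hourly prices (EUR/kWh) and charging requirements.
prices = [0.12, 0.10, 0.08, 0.05, 0.06, 0.09, 0.14, 0.11]
schedule = optimal_schedule(prices, required_kwh=10.0, max_rate_kwh=3.0)
cost = sum(p * e for p, e in zip(prices, schedule))
print(schedule)            # cheapest hours are filled to the rate limit
print(round(cost, 2))
```

In the paper's pipeline, schedules of this kind computed from historical prices would serve as the optimal-decision labels in the training data set; a richer formulation (battery dynamics, multi-day coupling) would require a genuine LP or the MDP treatment the paper develops.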
Journal: IEEE Transactions on Vehicular Technology
Early online date: 2016
Publication status: Published - 2017
MoE publication type: A1 Journal article-refereed
Keywords: cost reduction, demand response, plug-in electric vehicle, price prediction, reinforcement learning, smart charging