Reinforcement Learning-Based Plug-in Electric Vehicle Charging with Forecasted Price

Research output: Contribution to journal › Article

Standard

Reinforcement Learning-Based Plug-in Electric Vehicle Charging with Forecasted Price. / Chis, Adriana; Lunden, Jarmo; Koivunen, Visa.

In: IEEE Transactions on Vehicular Technology, Vol. 66, No. 5, Art. no. 7553556, 2017, pp. 3674-3684.



BibTeX

@article{e0190581a27c405daf0c7124f186821a,
title = "Reinforcement Learning-Based Plug-in Electric Vehicle Charging with Forecasted Price",
abstract = "This paper proposes a novel demand response method that aims at reducing the long-term cost of charging the battery of an individual plug-in electric vehicle (PEV). The problem is cast as a daily decision-making problem of choosing the amount of energy to charge into the PEV battery within a day. We model the problem as a Markov decision process with unknown transition probabilities. A batch reinforcement learning algorithm is proposed for learning an optimal cost-reducing charging policy from a batch of transition samples and for making cost-reducing charging decisions in new situations. In order to capture the day-to-day differences in electricity charging costs, the method uses actual electricity prices for the current day and predicted electricity prices for the following day. A Bayesian neural network is employed for predicting the electricity prices. The reinforcement learning training data set is constructed from historical prices. A linear programming-based method is developed for creating a data set of optimal charging decisions. Different charging scenarios are simulated for each day of the historical time frame using the set of past electricity prices. Simulation results using real-world pricing data demonstrate cost savings of 10{\%}-50{\%} for the PEV owner when using the proposed charging method.",
keywords = "cost reduction, demand response, plug-in electric vehicle, price prediction, reinforcement learning, smart charging",
author = "Adriana Chis and Jarmo Lunden and Visa Koivunen",
year = "2017",
doi = "10.1109/TVT.2016.2603536",
language = "English",
volume = "66",
pages = "3674--3684",
journal = "IEEE Transactions on Vehicular Technology",
issn = "0018-9545",
number = "5",
}

RIS

TY - JOUR

T1 - Reinforcement Learning-Based Plug-in Electric Vehicle Charging with Forecasted Price

AU - Chis, Adriana

AU - Lunden, Jarmo

AU - Koivunen, Visa

PY - 2017

Y1 - 2017

N2 - This paper proposes a novel demand response method that aims at reducing the long-term cost of charging the battery of an individual plug-in electric vehicle (PEV). The problem is cast as a daily decision-making problem of choosing the amount of energy to charge into the PEV battery within a day. We model the problem as a Markov decision process with unknown transition probabilities. A batch reinforcement learning algorithm is proposed for learning an optimal cost-reducing charging policy from a batch of transition samples and for making cost-reducing charging decisions in new situations. In order to capture the day-to-day differences in electricity charging costs, the method uses actual electricity prices for the current day and predicted electricity prices for the following day. A Bayesian neural network is employed for predicting the electricity prices. The reinforcement learning training data set is constructed from historical prices. A linear programming-based method is developed for creating a data set of optimal charging decisions. Different charging scenarios are simulated for each day of the historical time frame using the set of past electricity prices. Simulation results using real-world pricing data demonstrate cost savings of 10%-50% for the PEV owner when using the proposed charging method.

AB - This paper proposes a novel demand response method that aims at reducing the long-term cost of charging the battery of an individual plug-in electric vehicle (PEV). The problem is cast as a daily decision-making problem of choosing the amount of energy to charge into the PEV battery within a day. We model the problem as a Markov decision process with unknown transition probabilities. A batch reinforcement learning algorithm is proposed for learning an optimal cost-reducing charging policy from a batch of transition samples and for making cost-reducing charging decisions in new situations. In order to capture the day-to-day differences in electricity charging costs, the method uses actual electricity prices for the current day and predicted electricity prices for the following day. A Bayesian neural network is employed for predicting the electricity prices. The reinforcement learning training data set is constructed from historical prices. A linear programming-based method is developed for creating a data set of optimal charging decisions. Different charging scenarios are simulated for each day of the historical time frame using the set of past electricity prices. Simulation results using real-world pricing data demonstrate cost savings of 10%-50% for the PEV owner when using the proposed charging method.

KW - cost reduction

KW - demand response

KW - plug-in electric vehicle

KW - price prediction

KW - reinforcement learning

KW - smart charging

UR - http://www.scopus.com/inward/record.url?scp=84987943631&partnerID=8YFLogxK

U2 - 10.1109/TVT.2016.2603536

DO - 10.1109/TVT.2016.2603536

M3 - Article

VL - 66

SP - 3674

EP - 3684

JO - IEEE Transactions on Vehicular Technology

JF - IEEE Transactions on Vehicular Technology

SN - 0018-9545

IS - 5

M1 - 7553556

ER -
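The abstract mentions a linear programming-based method for generating optimal charging decisions from known past prices. As a rough illustration (not the paper's reinforcement learning method), the following sketch solves the simplest version of that benchmark problem: minimize charging cost over hourly prices subject to a total-energy requirement and a per-hour charging cap. For this box-constrained linear program the optimum is greedy: fill the cheapest hours first. All names and numbers here are hypothetical.

```python
def cheapest_charging_schedule(prices, energy_needed, max_per_hour):
    """Allocate `energy_needed` kWh across hours to minimize cost,
    charging at most `max_per_hour` kWh in any single hour.

    Greedy by price is optimal for this box-constrained LP:
    minimize sum(p[t] * x[t]) s.t. sum(x[t]) = E, 0 <= x[t] <= cap.
    """
    schedule = [0.0] * len(prices)
    remaining = energy_needed
    # Visit hours from cheapest to most expensive, filling each to its cap.
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if remaining <= 0:
            break
        amount = min(max_per_hour, remaining)
        schedule[hour] = amount
        remaining -= amount
    if remaining > 1e-9:
        raise ValueError("cannot charge the requested energy within the caps")
    return schedule

# Example: 6 hourly prices (cents/kWh); need 5 kWh, at most 2 kWh per hour.
prices = [30.0, 10.0, 25.0, 8.0, 40.0, 12.0]
plan = cheapest_charging_schedule(prices, energy_needed=5.0, max_per_hour=2.0)
cost = sum(p * x for p, x in zip(prices, plan))
```

The paper's contribution is learning such decisions ahead of time, under price uncertainty, from forecasts and historical data; this hindsight LP only supplies the optimal-decision labels that the abstract says are used to build the training set.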
