Demand Response for Home Energy Management Using Reinforcement Learning and Artificial Neural Network
- Authors
- Lu, Renzhi; Hong, Seung Ho; Yu, Mengmeng
- Issue Date
- Nov-2019
- Publisher
- IEEE (Institute of Electrical and Electronics Engineers Inc.)
- Keywords
- Artificial intelligence; reinforcement learning; artificial neural network; demand response; home energy management
- Citation
- IEEE TRANSACTIONS ON SMART GRID, v.10, no.6, pp. 6629-6639
- Pages
- 11
- Indexed
- SCIE
SCOPUS
- Journal Title
- IEEE TRANSACTIONS ON SMART GRID
- Volume
- 10
- Number
- 6
- Start Page
- 6629
- End Page
- 6639
- URI
- https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/2073
- DOI
- 10.1109/TSG.2019.2909266
- ISSN
- 1949-3053
1949-3061
- Abstract
- Ever-changing variables in the electricity market require energy management systems (EMSs) to make optimal real-time decisions adaptively. Demand response (DR) is the latest approach being used to improve the efficiency and stability of power systems. This paper proposes an hour-ahead DR algorithm for home EMSs. To deal with the uncertainty in future prices, a steady price prediction model based on an artificial neural network is presented. In combination with the forecasted prices, multi-agent reinforcement learning is adopted to make optimal decisions for different home appliances in a decentralized manner. To verify the performance of the proposed energy management scheme, simulations are conducted with non-shiftable, shiftable, and controllable loads. Experimental results demonstrate that the proposed DR algorithm can handle energy management for multiple appliances, minimize the user's energy bill and dissatisfaction cost, and significantly reduce electricity cost compared with a benchmark without DR.
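For illustration only, the sketch below shows how an hour-ahead scheme of the kind described in the abstract might be wired together: a simple price forecaster standing in for the paper's artificial-neural-network predictor, and one tabular Q-learning agent per shiftable appliance trading off energy cost against a dissatisfaction penalty. Every class name, the state/action design, and the cost weights here are hypothetical simplifications chosen for this record; they are not taken from the paper and do not reproduce the authors' implementation.

```python
# Hypothetical sketch of an hour-ahead DR loop: price forecast + per-appliance
# Q-learning agents. Not the authors' algorithm; all numbers are placeholders.
import random
from collections import defaultdict

class PriceForecaster:
    """Stand-in for the ANN price predictor: a naive 24-hour moving average."""
    def __init__(self, history):
        self.history = list(history)

    def predict_next_hour(self):
        window = self.history[-24:] or [0.1]
        return sum(window) / len(window)

    def observe(self, price):
        self.history.append(price)

class ApplianceAgent:
    """Tabular Q-learning agent deciding whether one shiftable appliance runs this hour."""
    def __init__(self, power_kw, discomfort_per_deferral, alpha=0.1, gamma=0.95, eps=0.1):
        self.q = defaultdict(float)          # Q[(state, action)], default 0.0
        self.power_kw = power_kw
        self.discomfort = discomfort_per_deferral
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def act(self, state):
        if random.random() < self.eps:
            return random.choice([0, 1])     # 0 = defer, 1 = run now
        return max((0, 1), key=lambda a: self.q[(state, a)])

    def learn(self, state, action, reward, next_state):
        best_next = max(self.q[(next_state, 0)], self.q[(next_state, 1)])
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])

def run_hour(agents, forecaster, hour, actual_price):
    """One hour-ahead step: forecast price, let each agent decide, then update from outcome."""
    predicted = forecaster.predict_next_hour()
    state = (hour % 24, round(predicted, 2))
    next_state = ((hour + 1) % 24, round(predicted, 2))
    for agent in agents:
        action = agent.act(state)
        # Reward = negative cost: energy bill if the appliance runs, discomfort if it defers
        # (a hypothetical weighting of bill versus dissatisfaction).
        cost = actual_price * agent.power_kw if action == 1 else agent.discomfort
        agent.learn(state, action, -cost, next_state)
    forecaster.observe(actual_price)

if __name__ == "__main__":
    forecaster = PriceForecaster(history=[0.10] * 24)
    agents = [ApplianceAgent(power_kw=2.0, discomfort_per_deferral=0.05),
              ApplianceAgent(power_kw=0.5, discomfort_per_deferral=0.02)]
    for h in range(48):                      # two simulated days of hourly decisions
        run_hour(agents, forecaster, h, actual_price=0.08 + 0.04 * ((h % 24) >= 17))
```

Each agent keeps its own Q-table and reacts only to the shared price signal, which mirrors the decentralized, per-appliance decision-making the abstract attributes to the multi-agent reinforcement learning scheme.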
- Appears in Collections
- COLLEGE OF ENGINEERING SCIENCES > SCHOOL OF ELECTRICAL ENGINEERING > 1. Journal Articles
