TY - JOUR
T1 - Novel Architecture of Energy Management Systems Based on Deep Reinforcement Learning in Microgrid
AU - Lee, Seongwoo
AU - Seon, Joonho
AU - Sun, Young Ghyu
AU - Kim, Soo Hyun
AU - Kyeong, Chanuk
AU - Kim, Dong In
AU - Kim, Jin Young
N1 - Publisher Copyright:
© 2010-2012 IEEE.
PY - 2024/3/1
Y1 - 2024/3/1
N2 - In microgrids, energy management systems (EMS) are considered essential for optimizing energy scheduling, control, and operation in reliable power systems. Conventional EMS research has predominantly relied on demand-side management and demand response (DR). Nonetheless, multi-action control in EMS faces operational challenges in terms of profitability and stability. In this paper, energy information systems (EIS), energy storage systems (ESS), energy trading risk management systems (ETRMS), and automatic DR (ADR) are integrated to efficiently manage the profitability and stability of the whole EMS through optimal energy scheduling. The proposed microgrid EMS architecture is optimized using the proximal policy optimization (PPO) algorithm, which is known for its learning stability and low complexity. A novel performance metric, the burden of load and generation (BoLG), is proposed to evaluate energy management performance. The BoLG is incorporated into the reward settings to optimize the management of multi-action controls such as load shifting, energy charging-discharging, and transactions. Simulation results confirm that the proposed architecture can improve energy management performance with a proper trade-off between stability and profitability, compared to dynamic programming (DP)-based and double deep Q-network (DDQN)-based operation.
AB - In microgrids, energy management systems (EMS) are considered essential for optimizing energy scheduling, control, and operation in reliable power systems. Conventional EMS research has predominantly relied on demand-side management and demand response (DR). Nonetheless, multi-action control in EMS faces operational challenges in terms of profitability and stability. In this paper, energy information systems (EIS), energy storage systems (ESS), energy trading risk management systems (ETRMS), and automatic DR (ADR) are integrated to efficiently manage the profitability and stability of the whole EMS through optimal energy scheduling. The proposed microgrid EMS architecture is optimized using the proximal policy optimization (PPO) algorithm, which is known for its learning stability and low complexity. A novel performance metric, the burden of load and generation (BoLG), is proposed to evaluate energy management performance. The BoLG is incorporated into the reward settings to optimize the management of multi-action controls such as load shifting, energy charging-discharging, and transactions. Simulation results confirm that the proposed architecture can improve energy management performance with a proper trade-off between stability and profitability, compared to dynamic programming (DP)-based and double deep Q-network (DDQN)-based operation.
KW - deep reinforcement learning
KW - demand response
KW - energy management systems
KW - Microgrid
KW - optimal power flow
UR - https://www.scopus.com/pages/publications/85173033621
U2 - 10.1109/TSG.2023.3317096
DO - 10.1109/TSG.2023.3317096
M3 - Article
AN - SCOPUS:85173033621
SN - 1949-3053
VL - 15
SP - 1646
EP - 1658
JO - IEEE Transactions on Smart Grid
JF - IEEE Transactions on Smart Grid
IS - 2
ER -