A hybrid deep learning model for short-term load forecasting of distribution networks integrating the channel attention mechanism

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

Optimizing short-term load forecasting performance is challenging due to the randomness of nonlinear power loads and the variability of system operation modes. Existing methods generally fail to combine the complementary advantages of individual models effectively and do not capture enough internal information from load data, which reduces forecasting accuracy. To achieve accurate and efficient short-term load forecasting, an integrated implementation framework is proposed based on a convolutional neural network (CNN), a gated recurrent unit (GRU) and a channel attention mechanism. The CNN and GRU are first combined to fully extract the highly complicated dynamic characteristics of the load sequence and learn its temporal dependencies. On top of the CNN-GRU network, the channel attention mechanism is introduced to further reduce the loss of historical information and strengthen the influence of important features. The overall framework of short-term load forecasting based on the CNN-GRU-Attention network is then proposed, and the coupling relationships between its stages are revealed. Finally, the developed framework is applied to a realistic load dataset from distribution networks, and the experimental results verify the effectiveness of the proposed method. Compared with state-of-the-art models, the CNN-GRU-Attention model performs better across different evaluation metrics.
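To make the pipeline in the abstract concrete, the following is a minimal NumPy sketch of one plausible reading of the CNN-GRU-Attention forecaster: a 1-D convolution extracts local features from a window of historical load, a squeeze-and-excitation-style channel attention reweights the feature channels, and a GRU summarizes the sequence before a linear head emits the next-step forecast. All layer sizes, the placement of attention between the CNN and GRU, and the random toy input are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w, b):
    """x: (T, C_in); w: (k, C_in, C_out). Valid convolution along time, then ReLU."""
    k = w.shape[0]
    out = np.stack([np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1])) + b
                    for t in range(x.shape[0] - k + 1)])
    return np.maximum(out, 0.0)

def channel_attention(h, w1, w2):
    """Squeeze-and-excitation-style gating: average-pool over time, score each
    channel with a small two-layer MLP, then rescale channels by sigmoid gates."""
    s = h.mean(axis=0)                          # squeeze: (C,)
    z = np.maximum(s @ w1, 0.0)                 # excitation, hidden layer
    g = 1.0 / (1.0 + np.exp(-(z @ w2)))         # gates in (0, 1), one per channel
    return h * g                                # broadcast over time steps

def gru_last_state(x, Wz, Uz, Wr, Ur, Wh, Uh):
    """Minimal GRU over x: (T, C); returns the final hidden state."""
    sigm = lambda a: 1.0 / (1.0 + np.exp(-a))
    h = np.zeros(Uz.shape[0])
    for t in range(x.shape[0]):
        z = sigm(x[t] @ Wz + h @ Uz)            # update gate
        r = sigm(x[t] @ Wr + h @ Ur)            # reset gate
        h_tilde = np.tanh(x[t] @ Wh + (r * h) @ Uh)
        h = (1.0 - z) * h + z * h_tilde
    return h

# Toy dimensions (assumed): 24 lagged load values, 1 input feature,
# 8 CNN channels, attention reduction ratio 2, GRU hidden size 16, kernel 3.
T, C_in, C, H, k = 24, 1, 8, 16, 3
x = rng.standard_normal((T, C_in))              # one window of historical load

w_conv = rng.standard_normal((k, C_in, C)) * 0.1
b_conv = np.zeros(C)
w1 = rng.standard_normal((C, C // 2)) * 0.1
w2 = rng.standard_normal((C // 2, C)) * 0.1
gru_w = [rng.standard_normal(s) * 0.1 for s in
         [(C, H), (H, H), (C, H), (H, H), (C, H), (H, H)]]
w_out = rng.standard_normal(H) * 0.1

feat = conv1d_relu(x, w_conv, b_conv)           # local feature extraction
feat = channel_attention(feat, w1, w2)          # emphasize important channels
state = gru_last_state(feat, *gru_w)            # capture temporal dependencies
y_hat = float(state @ w_out)                    # next-step load forecast (scalar)
print(feat.shape, np.isfinite(y_hat))
```

In a trained model all weights would be learned jointly by backpropagation; the sketch only shows how data flows through the three stages and how the attention gates let the network down-weight uninformative feature channels.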

Original language: English
Pages (from-to): 1770-1784
Number of pages: 15
Journal: IET Generation, Transmission and Distribution
Volume: 18
Issue number: 9
State: Published - May 2024

Keywords

  • power system identification
  • power system parameter estimation

