Abstract
Nonintrusive load monitoring (NILM) is expected to be a key function of the future smart grid. Its purpose is to improve the consumption and supply of electricity by disaggregating the total load into appliance-level loads without intrusive monitoring. Energy disaggregation has recently improved with the emergence of deep learning, but imbalanced datasets and long sequences pose multiple difficulties for model training. The distribution of ON/OFF states and the limitations of existing models lead to massive numbers of false-positive samples and undetected events. To tackle these problems, we propose a multiscale self-attention network (MSANet) that exploits both global temporal correlations and local sequential features. Specifically, a dilated window self-attention mechanism is proposed to compute local attention, and a multibranch structure exploits sequential features at different scales. Furthermore, an embedding of global temporal information is introduced to improve global contextual awareness, and dedicated subtask networks are designed for the different tasks to alleviate the effect of class imbalance. The proposed model is evaluated on the reference energy disaggregation dataset (REDD) and the U.K. domestic appliance-level electricity (UK-DALE) dataset, and shows outstanding performance in mean absolute error (MAE) and F1 score compared with baseline algorithms.
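The core idea of dilated window self-attention, where each time step attends only to neighbors at dilated offsets within a local window rather than to the whole sequence, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions: the function name, the `window` and `dilation` parameters, and the reuse of the input as queries, keys, and values are assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def dilated_window_attention(x, window=3, dilation=2):
    """Local self-attention over a dilated window.

    x : (T, d) array, a length-T sequence of d-dim features.
    Each position t attends to positions t + k*dilation for
    k in [-window, window], clipped to the sequence bounds.
    Larger dilation widens the receptive field at the same cost,
    which is how a multibranch design could cover several scales.
    """
    T, d = x.shape
    scale = np.sqrt(d)
    out = np.zeros_like(x)
    for t in range(T):
        # Dilated neighborhood of position t, clipped to [0, T).
        idx = [t + k * dilation for k in range(-window, window + 1)]
        idx = [i for i in idx if 0 <= i < T]
        # Scaled dot-product scores against the local keys only.
        scores = x[idx] @ x[t] / scale
        w = np.exp(scores - scores.max())
        w /= w.sum()
        # Weighted sum of the local values.
        out[t] = w @ x[idx]
    return out
```

Running several such branches with different `dilation` values and concatenating (or summing) their outputs would give the multiscale, multibranch behavior the abstract describes, at far lower cost than full O(T²) self-attention over long load sequences.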
| Original language | English |
|---|---|
| Article number | 2512212 |
| Journal | IEEE Transactions on Instrumentation and Measurement |
| Volume | 72 |
| DOIs | |
| State | Published - 2023 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs):
- SDG 7: Affordable and Clean Energy
Keywords
- Deep learning
- energy disaggregation
- nonintrusive load monitoring (NILM)
- self-attention
- sequence-to-sequence
Fingerprint
Dive into the research topics of 'Multiscale Self-Attention Architecture in Temporal Neural Network for Nonintrusive Load Monitoring'. Together they form a unique fingerprint.