Abstract
Multi-turbine wind power (WP) prediction contributes to wind turbine (WT) management and refined wind farm operations. However, the intricate and dynamic interrelationships among WTs make it difficult to fully exploit these relationships for improving prediction. This paper proposes a novel spatio-positional series attention long short-term memory (SPSA-LSTM) method, which extracts hidden correlations and temporal features from the historical wind speed (WS) and WP data of different WTs for high-precision short-term prediction. Using embedding techniques, we incorporate crucial spatial location information of the WTs into the time series, enhancing the model's representational capability. Furthermore, we employ a self-attention mechanism with strong relational modeling capability to extract correlation features among the time series; its learning capacity enables thorough exploration of the complex interdependencies within the inputs. Consequently, each WT is equipped with a comprehensive feature set comprising attention scores from all other WTs together with its own WS and WP. An LSTM fuses these features and extracts temporal patterns, ultimately generating the WP prediction outputs. Experiments conducted on 20 WTs demonstrate that our method significantly surpasses other baselines. Ablation experiments provide further evidence of the effectiveness of spatial embedding in optimizing prediction performance.
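The pipeline the abstract describes — spatial embedding added to each turbine's WS/WP series, self-attention across turbines, then an LSTM fusing the attended features — can be sketched as follows. This is a minimal illustration only: the dimensions, the use of turbine coordinates for the spatial embedding, and all weight matrices are hypothetical stand-ins (random numpy arrays in place of learned parameters in a deep-learning framework), not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N_TURBINES, T_STEPS, D_MODEL = 20, 48, 32   # illustrative sizes, not from the paper

# Each turbine's history: wind speed (WS) and wind power (WP) -> (N, T, 2)
x = rng.standard_normal((N_TURBINES, T_STEPS, 2))

# 1) Project the WS/WP series into the model dimension (hypothetical linear map).
W_in = rng.standard_normal((2, D_MODEL)) / np.sqrt(2)
h = x @ W_in                                          # (N, T, D)

# 2) Spatial embedding: a vector per turbine, derived here from stand-in
#    (x, y) coordinates, added to every time step of that turbine's series.
coords = rng.standard_normal((N_TURBINES, 2))         # hypothetical positions
W_pos = rng.standard_normal((2, D_MODEL)) / np.sqrt(2)
h = h + (coords @ W_pos)[:, None, :]                  # broadcast over time

# 3) Self-attention across turbines: at each step, every turbine attends to
#    all turbines, yielding an (N, N) matrix of attention scores.
Wq = rng.standard_normal((D_MODEL, D_MODEL)) / np.sqrt(D_MODEL)
Wk = rng.standard_normal((D_MODEL, D_MODEL)) / np.sqrt(D_MODEL)
Wv = rng.standard_normal((D_MODEL, D_MODEL)) / np.sqrt(D_MODEL)

def turbine_attention(s):
    """Scaled dot-product self-attention over the turbine axis; s: (N, D)."""
    q, k, v = s @ Wq, s @ Wk, s @ Wv
    scores = q @ k.T / np.sqrt(s.shape[-1])           # turbine-to-turbine scores
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                # row-wise softmax
    return w @ v

attended = np.stack([turbine_attention(h[:, t]) for t in range(T_STEPS)], axis=1)

# 4) An LSTM fuses the attended features over time; a minimal cell returns the
#    final hidden state, and a linear head gives one WP prediction per turbine.
Wx = rng.standard_normal((D_MODEL, 4 * D_MODEL)) / np.sqrt(D_MODEL)
Wh = rng.standard_normal((D_MODEL, 4 * D_MODEL)) / np.sqrt(D_MODEL)
b = np.zeros(4 * D_MODEL)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def lstm_last_hidden(seq):
    """seq: (T, D) -> final hidden state (D,)."""
    h_t, c_t = np.zeros(D_MODEL), np.zeros(D_MODEL)
    for x_t in seq:
        i, f, g, o = np.split(x_t @ Wx + h_t @ Wh + b, 4)
        c_t = sigmoid(f) * c_t + sigmoid(i) * np.tanh(g)
        h_t = sigmoid(o) * np.tanh(c_t)
    return h_t

W_out = rng.standard_normal((D_MODEL, 1)) / np.sqrt(D_MODEL)
preds = np.stack([lstm_last_hidden(attended[n]) for n in range(N_TURBINES)]) @ W_out
print(preds.shape)  # one short-term WP prediction per turbine
```

The key design point mirrored here is that the attention operates over the turbine axis (not the time axis), so each turbine's representation at every step is a weighted combination of all turbines' states, with the spatial embedding shaping those weights.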
| Original language | English |
|---|---|
| Article number | 016102 |
| Journal | Journal of Renewable and Sustainable Energy |
| Volume | 16 |
| Issue number | 1 |
| DOIs | |
| State | Published - 1 Jan 2024 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs):
- SDG 7: Affordable and Clean Energy
Fingerprint
Dive into the research topics of 'Integrating spatio-positional series attention to deep network for multi-turbine short-term wind power prediction'.