Graph Explicit Attention Network Based on Predefined Strategy

Abstract
The graph attention network (GAT) has achieved great success in graph representation learning in recent years. However, the poor storage efficiency of GAT during training severely limits the application of attention mechanisms in the graph domain. In particular, the attention coefficients and the multi-head mechanism are computed or introduced during model training, which markedly increases the storage complexity. In this paper, we present the graph explicit attention network (GEAT), a novel graph attention architecture that leverages a predefined strategy to calculate attention coefficients, combining global structural and node feature information. In this way, GEAT effectively reduces storage complexity, improves training efficiency, and alleviates the over-smoothing of attention scores. Experiments on the benchmark datasets Cora, Citeseer, and Pubmed demonstrate that our model outperforms state-of-the-art methods by a clear margin on challenging classification tasks while remaining computationally efficient.
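The abstract's core idea, replacing learned GAT attention with coefficients precomputed from structure and features, can be sketched as follows. The paper's exact formula is not given in this record, so the particular combination used here (a GCN-style row-normalized adjacency blended with a neighborhood softmax over cosine feature similarity, weighted by a hypothetical `alpha` parameter) is an illustrative assumption, not the authors' method.

```python
import numpy as np

def explicit_attention(adj, feats, alpha=0.5):
    """Precompute fixed attention coefficients from structure and features.

    adj   : (n, n) binary adjacency matrix, no self-loops
    feats : (n, d) node feature matrix
    alpha : hypothetical weight balancing the structural and feature terms
    """
    n = adj.shape[0]
    a = adj + np.eye(n)                        # add self-loops
    deg = a.sum(axis=1, keepdims=True)
    structural = a / deg                       # row-normalized adjacency (GCN-style)

    norms = np.linalg.norm(feats, axis=1, keepdims=True) + 1e-12
    unit = feats / norms
    sim = unit @ unit.T                        # cosine similarity between node features
    sim = np.where(a > 0, sim, -np.inf)        # restrict to each node's neighborhood
    sim = np.exp(sim - sim.max(axis=1, keepdims=True))
    feature = sim / sim.sum(axis=1, keepdims=True)  # softmax over neighbors

    # Each row sums to 1 and can serve as fixed aggregation weights,
    # computed once before training instead of learned per layer.
    return alpha * structural + (1 - alpha) * feature
```

Because these coefficients are computed once up front, no per-head attention parameters or intermediate attention maps need to be stored during training, which is the storage saving the abstract claims.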
| Original language | English |
|---|---|
| Pages (from-to) | 422-429 |
| Number of pages | 8 |
| Journal | Procedia Computer Science |
| Volume | 202 |
| DOIs | |
| State | Published - 2022 |
| Event | 12th International Conference on Identification, Information and Knowledge in the internet of Things, IIKI 2021 - Hangzhou, China Duration: 18 Dec 2021 → 18 Dec 2021 |
Keywords
- Attention mechanism
- Deep Learning
- Graph neural network