Graph Explicit Attention Network Based on Predefined Strategy

Research output: Contribution to journal › Conference article › peer-review


Abstract

Graph attention networks (GAT) have achieved great success in graph representation learning in recent years. However, the low training efficiency of GAT severely limits the application of attention mechanisms in the graph domain. In particular, the attention coefficients and the multi-head mechanism are calculated or introduced during model training, which significantly increases the storage complexity. In this paper, we present the graph explicit attention network (GEAT), a novel graph attention architecture that leverages a predefined strategy to calculate attention coefficients, combining global structural and node feature information. In this way, GEAT effectively reduces the storage complexity, improves the training efficiency, and alleviates the over-smoothing problem of attention scores. Experiments on benchmark datasets (Cora, Citeseer, and Pubmed) demonstrate that our model outperforms state-of-the-art methods by a clear margin on challenging classification tasks, while being computationally efficient.
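The abstract's core idea is to precompute fixed attention coefficients from global structure and node features instead of learning them during training. The following NumPy sketch illustrates one plausible predefined strategy; the specific blending rule (row-normalized adjacency mixed with edge-masked cosine feature similarity, weighted by a hypothetical parameter `alpha`) is an assumption for illustration, not the formula from the paper.

```python
import numpy as np

def explicit_attention(adj, feats, alpha=0.5, eps=1e-12):
    """Precompute fixed attention coefficients from structure and features.

    This is a sketch of a 'predefined strategy' in the spirit of GEAT;
    the exact combination rule used by the authors is not reproduced here.
    """
    n = adj.shape[0]
    # Structural term: random-walk-normalized adjacency with self-loops
    # (each row sums to 1, so it already acts like an attention distribution).
    a = adj + np.eye(n)
    struct = a / a.sum(axis=1, keepdims=True)
    # Feature term: cosine similarity between node features,
    # masked to existing edges (plus self-loops).
    unit = feats / (np.linalg.norm(feats, axis=1, keepdims=True) + eps)
    sim = (unit @ unit.T) * (a > 0)
    sim = sim / (sim.sum(axis=1, keepdims=True) + eps)
    # Blend the two row-stochastic terms; rows still sum to ~1.
    return alpha * struct + (1.0 - alpha) * sim

# Because the coefficients are fixed, propagation needs no learned
# attention parameters: H' = A_hat @ X @ W for any weight matrix W.
```

Since `A_hat` is computed once before training, no per-epoch attention scores or multi-head copies need to be stored, which is the storage saving the abstract describes.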

Original language: English
Pages (from-to): 422-429
Number of pages: 8
Journal: Procedia Computer Science
Volume: 202
DOIs
State: Published - 2022
Event: 12th International Conference on Identification, Information and Knowledge in the Internet of Things, IIKI 2021 - Hangzhou, China
Duration: 18 Dec 2021 - 18 Dec 2021

Keywords

  • Attention mechanism
  • Deep Learning
  • Graph neural network

