TY - JOUR
T1 - Self-Supervised Hypergraph Training Framework via Structure-Aware Learning
AU - Feng, Yifan
AU - Liu, Shiquan
AU - Ying, Shihui
AU - Du, Shaoyi
AU - Wu, Zongze
AU - Gao, Yue
N1 - Publisher Copyright:
© 1979-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Hypergraphs, with their ability to model complex, beyond-pairwise correlations, present a significant advancement over traditional graphs for capturing intricate relational data across diverse domains. However, the integration of hypergraphs into self-supervised learning (SSL) frameworks has been hindered by the intricate nature of high-order structural variations. This paper introduces the Self-Supervised Hypergraph Training Framework via Structure-Aware Learning (SS-HT), designed to enhance the perception and measurement of these variations within hypergraphs. The SS-HT framework employs a “Masking and Re-Masking” strategy to bolster feature reconstruction in Hypergraph Neural Networks (HGNNs), addressing the limitations of traditional SSL methods. It also introduces a metric strategy for local high-order correlation changes, improving the computational efficiency of structural distance calculations. Extensive experiments on 11 datasets demonstrate SS-HT’s superior performance over existing SSL methods for both low-order and high-order data. Notably, the framework significantly reduces data labeling dependency, achieving a 32% improvement over HGNN in the downstream task fine-tuning phase under the 1% labeled data setting on the Cora-CC dataset. Ablation studies further validate SS-HT’s scalability and its capacity to augment the performance of various HGNN methods, underscoring its robustness and applicability in real-world scenarios.
KW - High-order learning
KW - high-order structure distance metric
KW - self-supervised learning
UR - https://www.scopus.com/pages/publications/105012492296
U2 - 10.1109/TPAMI.2025.3594487
DO - 10.1109/TPAMI.2025.3594487
M3 - Article
C2 - 40742849
AN - SCOPUS:105012492296
SN - 0162-8828
VL - 47
SP - 10160
EP - 10176
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
IS - 11
ER -