
Hypergraph Foundation Model

  • Yue Gao
  • Yifan Feng
  • Shiquan Liu
  • Xiangmin Han
  • Shaoyi Du
  • Zongze Wu
  • Han Hu
  • Tsinghua University
  • Xi'an Jiaotong University
  • Shenzhen University
  • Beijing Institute of Technology

Research output: Contribution to journal › Article › peer-review

Abstract

Hypergraph neural networks (HGNNs) effectively model complex high-order relationships in domains such as protein interactions and social networks: by connecting multiple vertices through a single hyperedge, they enhance modeling capability and reduce information loss. Developing foundation models for hypergraphs is challenging because of their distinct data, which combines vertex features with intricate structural information. We present Hyper-FM, a Hypergraph Foundation Model for multi-domain knowledge extraction, featuring Hierarchical High-Order Neighbor Guided Vertex Knowledge Embedding for vertex feature representation and Hierarchical Multi-Hypergraph Guided Structural Knowledge Extraction for structural information. Additionally, we curate 11 text-attributed hypergraph datasets to advance research at the intersection of HGNNs and LLMs. Experiments on these datasets show that Hyper-FM outperforms baseline methods by approximately 13.4%, validating our approach. Furthermore, we propose the first scaling law for hypergraph foundation models, demonstrating that increasing domain diversity significantly enhances performance, whereas merely augmenting vertex and hyperedge counts does not. This underscores the critical role of domain diversity in scaling hypergraph models.
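To make the hypergraph setting concrete, the sketch below (not the paper's Hyper-FM, just a minimal illustration with assumed toy data) shows how a hyperedge connects multiple vertices via an incidence matrix, and performs one degree-normalized propagation step of the kind that underlies many HGNNs: vertex features are aggregated into hyperedges and scattered back to vertices.

```python
import numpy as np

# Toy hypergraph: 4 vertices, 2 hyperedges (assumed example data).
# Incidence matrix H: H[v, e] = 1 iff vertex v belongs to hyperedge e.
# Hyperedge 0 = {v0, v1, v2}, hyperedge 1 = {v1, v3}.
H = np.array([
    [1, 0],
    [1, 1],
    [1, 0],
    [0, 1],
], dtype=float)

X = np.eye(4)  # toy vertex features (one-hot)

Dv = np.diag(H.sum(axis=1))  # vertex degrees
De = np.diag(H.sum(axis=0))  # hyperedge sizes

# One smoothing step: vertices -> hyperedges -> vertices,
# with degree normalization on both sides.
X_new = np.linalg.inv(Dv) @ H @ np.linalg.inv(De) @ H.T @ X

print(X_new.shape)  # (4, 4)
```

Because of the two-sided normalization, the operator is row-stochastic: each vertex's new feature is a weighted average over vertices that share a hyperedge with it, which is why hyperedges capture group-level (high-order) relations rather than only pairwise ones.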

Original language: English
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
DOI
Publication status: Accepted/In press - 2025
Published externally
