TY - GEN
T1 - Contrastive Graph Representations for Logical Formulas Embedding (Extended Abstract)
AU - Lin, Qika
AU - Liu, Jun
AU - Zhang, Lingling
AU - Pan, Yudai
AU - Hu, Xin
AU - Xu, Fangzhi
AU - Zeng, Hongwei
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Embedding symbolic logical formulas into a low-dimensional continuous space is an effective approach for neural-symbolic systems. However, existing studies are constrained by syntactic structure modeling and fail to preserve intrinsic semantics. To this end, we propose a novel model of Contrastive Graph Representations (ConGR) for logical formula embedding. First, it introduces a densely connected graph convolutional network (GCN) with an attention mechanism to process syntax parsing graphs of formulas. Second, contrastive instances for each anchor formula are generated by transformations guided by logical properties. Two types of contrast, global-local and global-global, are carried out to refine formula embeddings with semantic information. Extensive experiments demonstrate that ConGR obtains superior performance against state-of-the-art baselines.
AB - Embedding symbolic logical formulas into a low-dimensional continuous space is an effective approach for neural-symbolic systems. However, existing studies are constrained by syntactic structure modeling and fail to preserve intrinsic semantics. To this end, we propose a novel model of Contrastive Graph Representations (ConGR) for logical formula embedding. First, it introduces a densely connected graph convolutional network (GCN) with an attention mechanism to process syntax parsing graphs of formulas. Second, contrastive instances for each anchor formula are generated by transformations guided by logical properties. Two types of contrast, global-local and global-global, are carried out to refine formula embeddings with semantic information. Extensive experiments demonstrate that ConGR obtains superior performance against state-of-the-art baselines.
KW - Contrastive Learning
KW - Graph Representation
KW - Logical Formulas Embedding
UR - https://www.scopus.com/pages/publications/85200509765
U2 - 10.1109/ICDE60146.2024.00490
DO - 10.1109/ICDE60146.2024.00490
M3 - Conference contribution
AN - SCOPUS:85200509765
T3 - Proceedings - International Conference on Data Engineering
SP - 5717
EP - 5718
BT - Proceedings - 2024 IEEE 40th International Conference on Data Engineering, ICDE 2024
PB - IEEE Computer Society
T2 - 40th IEEE International Conference on Data Engineering, ICDE 2024
Y2 - 13 May 2024 through 17 May 2024
ER -