TY - GEN
T1 - Graph-based KB and Text Fusion Interaction Network for Open Domain Question Answering
AU - Ding, Yi
AU - Rao, Yuan
AU - Yang, Fan
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
N2 - The incompleteness of the knowledge base (KB) limits the performance of open domain question answering (QA). Representing the incomplete KB with a graph attention network (GAT) and complementing it with extra text has proven effective in boosting QA systems when the KB is incomplete. In this paper, we propose a Graph-based KB and Text Fusion Interaction Network (GTFIN) to improve the performance of QA over an incomplete KB by jointly utilizing KB and text information. In GTFIN, to reduce the influence of query-unrelated noisy information in the GAT on final answer prediction, we first design a global-normalization graph attention network (GGAT) that determines query-related edge weights from a global perspective; we then propose a coarse-to-fine text reader (CFReader) that both exploits relation information and obtains entity mention representations from the text to enhance the incomplete KB. We further incorporate a bi-attention mechanism to strengthen the interaction between question and entity representations, which can find more query-related entities for final answer prediction. On the widely used KBQA benchmark WebQSP, our model achieves state-of-the-art performance.
AB - The incompleteness of the knowledge base (KB) limits the performance of open domain question answering (QA). Representing the incomplete KB with a graph attention network (GAT) and complementing it with extra text has proven effective in boosting QA systems when the KB is incomplete. In this paper, we propose a Graph-based KB and Text Fusion Interaction Network (GTFIN) to improve the performance of QA over an incomplete KB by jointly utilizing KB and text information. In GTFIN, to reduce the influence of query-unrelated noisy information in the GAT on final answer prediction, we first design a global-normalization graph attention network (GGAT) that determines query-related edge weights from a global perspective; we then propose a coarse-to-fine text reader (CFReader) that both exploits relation information and obtains entity mention representations from the text to enhance the incomplete KB. We further incorporate a bi-attention mechanism to strengthen the interaction between question and entity representations, which can find more query-related entities for final answer prediction. On the widely used KBQA benchmark WebQSP, our model achieves state-of-the-art performance.
KW - bi-attention
KW - coarse-to-fine text reader
KW - incomplete knowledge base
KW - open domain question answering
UR - https://www.scopus.com/pages/publications/85116492030
U2 - 10.1109/IJCNN52387.2021.9534439
DO - 10.1109/IJCNN52387.2021.9534439
M3 - Conference contribution
AN - SCOPUS:85116492030
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
Y2 - 18 July 2021 through 22 July 2021
ER -