Graph Attention Mechanism With Cardinality Preservation For Knowledge Graph Completion

KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I(2021)

Abstract
Embedding knowledge graphs with graph attention networks has become a novel research topic in the field of knowledge graph completion. However, when generating entity embeddings for a knowledge graph, the current graph attention network produces the same embeddings for different structures and different entities. The quality of the embeddings directly determines the effectiveness of completion. We analyze why the graph attention network cannot distinguish structures: attention-based GNN aggregation ignores cardinality information, which reflects the diversity of neighbor features and helps to distinguish the contributions of different nodes in a neighborhood. Therefore, we propose a graph attention model with cardinality preservation (KBCPA). Cardinality information is added into the attention-based aggregation to generate distinct representations for different entities, thus improving the discriminative ability of the model. Our experiments show that the model is effective and competitive, obtaining better performance than previous state-of-the-art embedding models for knowledge graph completion on two benchmark datasets, WN18RR and FB15k-237.
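The abstract's central observation can be illustrated with a minimal sketch. Because attention weights are softmax-normalized to sum to 1, a plain attention aggregator returns the same vector for two neighborhoods of identical features regardless of how many neighbors there are; adding an unnormalized, cardinality-sensitive term changes this. The functions and the additive correction term below are illustrative assumptions, not the paper's exact KBCPA formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(h_neighbors, scores):
    """Plain attention aggregation: weights sum to 1, so the output is
    invariant to the number of identical neighbors (cardinality is lost)."""
    alpha = softmax(scores)          # attention weights, shape (n,)
    return alpha @ h_neighbors       # convex combination of neighbor features

def cardinality_preserving_aggregate(h_neighbors, scores, w=0.1):
    """Hypothetical cardinality-preserving variant: add an unnormalized
    neighbor-sum term (scaled by w) so neighborhood size affects the output."""
    return attention_aggregate(h_neighbors, scores) + w * h_neighbors.sum(axis=0)

# Two neighborhoods with identical features but different cardinality.
h_two = np.ones((2, 4))    # two identical neighbors
h_five = np.ones((5, 4))   # five identical neighbors
s_two, s_five = np.zeros(2), np.zeros(5)

# Plain attention cannot tell the neighborhoods apart...
print(np.allclose(attention_aggregate(h_two, s_two),
                  attention_aggregate(h_five, s_five)))   # True
# ...while the cardinality term distinguishes them.
print(np.allclose(cardinality_preserving_aggregate(h_two, s_two),
                  cardinality_preserving_aggregate(h_five, s_five)))  # False
```

In the example, both attention outputs equal the all-ones vector, while the cardinality-preserving outputs differ (1.2 vs. 1.5 per component), which is exactly the discriminative ability the abstract attributes to preserving cardinality.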
Keywords
Cardinality preservation, Knowledge graph, Graph attention mechanism