Talking-heads attention-based knowledge representation for link prediction

COMPUTER SPEECH AND LANGUAGE (2022)

Abstract
State-of-the-art methods for link prediction, also known as knowledge graph embedding, aim to represent both the entities and relations of a given knowledge graph (KG) in a continuous low-dimensional vector space, and can thus be used to fill in missing facts or identify spurious facts in KGs, where a fact is represented as a triple of the form (head entity, relation, tail entity). Most previous attempts learn each triple independently and thus fail to exploit the rich hidden inference and semantic information in the local neighbourhood surrounding each triple in a KG. To this end, this paper proposes a talking-heads attention-based knowledge representation method, a novel graph attention network-based method for link prediction that learns knowledge graph embeddings under talking-heads attention guidance from multi-hop neighbourhood triples. We evaluate our model on link prediction over the Freebase, WordNet and Kinship datasets; experimental results demonstrate that injecting the talking-heads attention mechanism better captures the semantic relationships among neighbouring triples and indeed achieves promising performance on link prediction.
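The abstract's core ingredient, talking-heads attention, extends multi-head attention by linearly mixing the attention maps across heads, once on the pre-softmax logits and once on the post-softmax weights, so that heads can exchange information. The paper's own architecture is not specified here; the following is a minimal numpy sketch of the general mechanism only, with hypothetical shapes and mixing matrices `W_logits` and `W_weights`:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def talking_heads_attention(Q, K, V, W_logits, W_weights):
    """Talking-heads attention (sketch, not the paper's exact model).

    Q, K, V            : (heads, seq_len, d) per-head projections
    W_logits, W_weights: (heads, heads) learned head-mixing matrices
    """
    d = Q.shape[-1]
    # Standard scaled dot-product logits per head: (heads, seq, seq).
    logits = Q @ K.transpose(0, 2, 1) / np.sqrt(d)
    # "Talk" across heads before the softmax: mix the logits.
    logits = np.einsum('hij,hk->kij', logits, W_logits)
    weights = softmax(logits, axis=-1)
    # "Talk" across heads again after the softmax: mix the weights.
    weights = np.einsum('hij,hk->kij', weights, W_weights)
    return weights @ V  # (heads, seq, d)
```

With identity mixing matrices this reduces exactly to ordinary multi-head attention; the learned mixing is what lets one head's attention pattern inform another's, which is the property the paper leverages over neighbourhood triples.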
Keywords
Knowledge representation, Link prediction, Talking-heads attention