RLGNet: Repeating-Local-Global History Network for Temporal Knowledge Graph Reasoning
CoRR (2024)
Abstract
Temporal Knowledge Graph (TKG) reasoning involves predicting future events
based on historical information. However, due to the unpredictability of future
events, this task is highly challenging. To address this issue, we propose a
multi-scale hybrid architecture model based on ensemble learning, called RLGNet
(Repeating-Local-Global History Network). Inspired by the application of
multi-scale information in other fields, we introduce the concept of
multi-scale information into TKG reasoning. Specifically, RLGNet captures and
integrates different levels of historical information by combining modules that
process information at various scales. The model comprises three modules: the
Repeating History Module focuses on identifying repetitive patterns and trends
in historical data, the Local History Module captures short-term changes and
details, and the Global History Module provides a macro perspective on
long-term changes. Additionally, to address the limitations of previous
single-architecture models in generalizing across single-step and multi-step
reasoning tasks, we adopt architectures based on Recurrent Neural Networks
(RNN) and Multi-Layer Perceptrons (MLP) for the Local and Global History
Modules, respectively. This hybrid design allows the model's multi-step and
single-step reasoning capabilities to complement each other. Finally, to
address the issue of noise in TKGs, we adopt an ensemble learning strategy,
combining the predictions of the three modules to reduce the impact of noise on
the final prediction results. In the evaluation on six benchmark datasets, our
approach generally outperforms existing TKG reasoning models in multi-step and
single-step reasoning tasks.
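The abstract describes combining the predictions of the three history modules via ensemble learning to reduce the impact of noise. The sketch below illustrates one plausible combination scheme, a weighted sum of the modules' per-entity probability distributions; the module internals and the function names are placeholders, not the paper's actual implementation.

```python
# Hypothetical sketch of the ensemble step described in the abstract.
# The three score vectors stand in for the outputs of the Repeating,
# Local, and Global History Modules; their internals (e.g. RNN- and
# MLP-based encoders) are omitted here.
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over candidate entities
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def ensemble_scores(repeat_scores, local_scores, global_scores,
                    weights=(1.0, 1.0, 1.0)):
    """Combine per-entity scores from the three modules for one query.

    Each argument is an array of shape (num_entities,) holding one
    module's raw scores. Averaging the modules' probability
    distributions dampens noise coming from any single module.
    """
    w_r, w_l, w_g = weights
    combined = (w_r * softmax(repeat_scores)
                + w_l * softmax(local_scores)
                + w_g * softmax(global_scores))
    return combined / combined.sum()  # renormalize to a distribution
```

The predicted entity for a query would then be the argmax of the combined distribution; the per-module weights could be tuned on a validation set.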