Temporal Graph Transformer for Dynamic Network

Artificial Neural Networks and Machine Learning - ICANN 2022, Part II (2022)

Abstract
Graph neural networks (GNNs) have received great attention in recent years due to their unique role in mining graph-based data. Although most work focuses on learning low-dimensional node representations in static graphs, the dynamic nature of real-world networks makes temporal graphs more practical and significant. The continuous-time dynamic graph (CTDG) is a general approach to expressing temporal networks at fine granularity. Owing to the high time cost of training and inference, existing CTDG-based algorithms capture information only from 1-hop neighbors, ignoring messages from higher-order neighbors, which inevitably leads to model degradation. To overcome this challenge, we propose the Temporal Graph Transformer (TGT) to efficiently capture evolving and semantic information from high-order neighborhoods in dynamic graphs. The proposed TGT consists of three modules: an update module, an aggregation module, and a propagation module. Unlike previous works that aggregate messages layer by layer, the model captures messages from 1-hop and 2-hop neighbors in a single layer. In particular, (1) the update module learns from messages derived from interactions; (2) the aggregation module aggregates 1-hop temporal neighbors to compute node embeddings; (3) the propagation module re-updates the hidden states of temporal neighbors to introduce 2-hop information. Experimental results on three real-world networks demonstrate the superiority of TGT in both efficacy and efficiency.
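The three-module flow in the abstract can be sketched as a simple event-processing loop. This is only an illustrative toy, not the paper's method: the decayed-average update, mean-pooling aggregation, and the function names (`update`, `aggregate`, `propagate`, `process_event`) are assumptions standing in for TGT's learned update function, temporal attention, and message encoders.

```python
# Toy sketch of TGT-style single-layer event processing (assumed structure;
# the real model uses learned, attention-based versions of these steps).
from collections import defaultdict

DIM = 4

def zeros():
    return [0.0] * DIM

state = defaultdict(zeros)        # hidden state per node
neighbors = defaultdict(list)     # 1-hop temporal neighbors: node -> [(nbr, t)]

def update(node, message):
    # Update module: fold the interaction message into the node's hidden state
    # (a decayed average stands in for the learned update function).
    s = state[node]
    state[node] = [0.9 * a + 0.1 * b for a, b in zip(s, message)]

def aggregate(node):
    # Aggregation module: embed a node from its 1-hop temporal neighbors
    # (mean pooling stands in for the learned temporal attention).
    nbrs = neighbors[node]
    if not nbrs:
        return state[node][:]
    summed = zeros()
    for nbr, _t in nbrs:
        summed = [a + b for a, b in zip(summed, state[nbr])]
    return [s / len(nbrs) for s in summed]

def propagate(node, message):
    # Propagation module: re-update the 1-hop neighbors' hidden states so
    # that their next aggregation already carries 2-hop information.
    for nbr, _t in neighbors[node]:
        update(nbr, message)

def process_event(u, v, t, message):
    # One interaction handled in a single layer: update both endpoints,
    # push the message one hop further, then record the temporal edge.
    update(u, message)
    update(v, message)
    propagate(u, message)
    propagate(v, message)
    neighbors[u].append((v, t))
    neighbors[v].append((u, t))
    return aggregate(u), aggregate(v)
```

Note how `propagate` is what lets a single layer see 2-hop context: after the event (1, 2) below, node 0 (a 1-hop neighbor of node 1) has its state refreshed even though it did not take part in the interaction.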
Keywords
Temporal graph, Continuous-time dynamic graph, Graph neural network, Graph embedding