Isomorphic-Consistent Variational Graph Auto-Encoders for Multi-Level Graph Representation Learning
CoRR (2023)
Abstract
Graph representation learning is a fundamental research theme and can be
generalized to benefit multiple downstream tasks from the node and link levels
to the higher graph level. In practice, it is desirable to develop
task-agnostic general graph representation learning methods that are typically
trained in an unsupervised manner. Related research reveals that the power of
graph representation learning methods depends on whether they can differentiate
distinct graph structures as different embeddings and map isomorphic graphs to
consistent embeddings (i.e., the isomorphic consistency of graph models).
However, for task-agnostic general graph representation learning, existing
unsupervised graph models, represented by the variational graph auto-encoders
(VGAEs), can only keep the isomorphic consistency within the subgraphs of 1-hop
neighborhoods and thus usually manifest inferior performance on the more
difficult higher-level tasks. To overcome the limitations of existing
unsupervised methods, in this paper, we propose the Isomorphic-Consistent VGAE
(IsoC-VGAE) for multi-level task-agnostic graph representation learning. We
first devise a decoding scheme to provide a theoretical guarantee of keeping
the isomorphic consistency under the settings of unsupervised learning. We then
propose the Inverse Graph Neural Network (Inv-GNN) decoder as its intuitive
realization, which trains the model via reconstructing the GNN node embeddings
with multi-hop neighborhood information, so as to maintain the high-order
isomorphic consistency within the VGAE framework. We conduct extensive
experiments on the representative graph learning tasks at different levels,
including node classification, link prediction and graph classification, and
the results verify that our proposed model generally outperforms both the
state-of-the-art unsupervised methods and representative supervised methods.
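The core idea described above, training a VGAE-style model by reconstructing multi-hop GNN node embeddings rather than the adjacency matrix, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the parameter-free mean-aggregation GNN used as the reconstruction target, the linear encoder/decoder weights, and the toy ring graph are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_embeddings(adj, feats, hops=2):
    """Multi-hop mean aggregation over neighbors + self.

    A parameter-free stand-in for the GNN whose node embeddings the
    Inv-GNN-style decoder is trained to reconstruct (an assumption; the
    paper's actual GNN target may differ).
    """
    deg = adj.sum(axis=1, keepdims=True) + 1.0  # +1 accounts for the self term
    h = feats
    for _ in range(hops):
        h = (adj @ h + h) / deg
    return h

# Toy graph: a 4-node ring with random 3-dim node features.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
feats = rng.normal(size=(4, 3))

# Encoder: 1-hop aggregation followed by linear maps to the mean and
# log-variance of a 2-dim latent z (standard VGAE reparameterization).
W_mu = rng.normal(size=(3, 2))
W_logvar = rng.normal(size=(3, 2))
h1 = gnn_embeddings(adj, feats, hops=1)
mu, logvar = h1 @ W_mu, h1 @ W_logvar
z = mu + np.exp(0.5 * logvar) * rng.normal(size=mu.shape)

# Decoder: instead of reconstructing adjacency (the standard VGAE decoder),
# map z back toward the *2-hop* GNN embeddings, so the objective carries
# higher-order neighborhood information.
W_dec = rng.normal(size=(2, 3))
target = gnn_embeddings(adj, feats, hops=2)
recon_loss = np.mean((z @ W_dec - target) ** 2)

# Usual KL regularizer toward a standard normal prior on z.
kl = -0.5 * np.mean(1 + logvar - mu**2 - np.exp(logvar))
loss = recon_loss + kl
print(float(loss))
```

In a real model the linear maps would be learned (the loss above would be minimized by gradient descent) and the encoder would itself be a GNN; the sketch only shows how the reconstruction target changes from adjacency entries to multi-hop node embeddings.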