Parameterized Hypercomplex Graph Neural Networks for Graph Classification

ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT III (2021)

Abstract
Despite recent advances in representation learning in hypercomplex (HC) space, this subject remains largely unexplored in the context of graphs. Motivated by the complex and quaternion algebras, which have been shown in several contexts to enable effective representation learning with an inherent weight-sharing mechanism, we develop graph neural networks that leverage the properties of hypercomplex feature transformations. In particular, in our proposed class of models, the multiplication rule specifying the algebra itself is inferred from the data during training. Given a fixed model architecture, we present empirical evidence that our proposed model incorporates a regularization effect, alleviating the risk of overfitting. We also show that for a fixed model capacity, our proposed method outperforms its corresponding real-formulated GNN, providing additional evidence for the enhanced expressivity of HC embeddings. Finally, we test our proposed hypercomplex GNN on several open graph benchmark datasets and show that our models reach state-of-the-art performance with a much smaller memory footprint, using 70% fewer parameters. Our implementations are available at https://github.com/bayer-science-for-a-better-life/phc-gnn.
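To make the core idea concrete, below is a minimal sketch of a parameterized hypercomplex multiplication (PHM) layer of the kind the abstract describes, in which the weight matrix is assembled as a sum of Kronecker products W = Σᵢ Aᵢ ⊗ Sᵢ and the n×n "algebra" matrices Aᵢ are learned jointly with the component weights Sᵢ, so the multiplication rule itself is inferred from data. This is an illustrative assumption-laden sketch, not the authors' exact implementation; the class name, shapes, and initialization are hypothetical.

```python
import torch
import torch.nn as nn


class PHMLinear(nn.Module):
    """Sketch of a parameterized hypercomplex multiplication (PHM) layer.

    Builds the weight as W = sum_i kron(A[i], S[i]). The matrices A (the
    multiplication rule of the algebra) are trainable, so the algebra is
    learned rather than fixed to, e.g., complex or quaternion rules.
    Illustrative only; names and initialization are assumptions.
    """

    def __init__(self, n: int, in_features: int, out_features: int):
        super().__init__()
        assert in_features % n == 0 and out_features % n == 0
        self.n = n
        # Learnable "multiplication rule": n matrices of shape (n, n).
        self.A = nn.Parameter(torch.randn(n, n, n) * 0.1)
        # Component weights: n matrices of shape (out/n, in/n).
        self.S = nn.Parameter(torch.randn(n, out_features // n, in_features // n) * 0.1)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Assemble the full (out_features, in_features) weight as a sum
        # of Kronecker products, then apply it as an affine map.
        W = torch.stack(
            [torch.kron(self.A[i], self.S[i]) for i in range(self.n)]
        ).sum(dim=0)
        return x @ W.T + self.bias


# Usage: with n = 4 (quaternion-like), the layer stores roughly
# out*in/n + n^3 parameters instead of out*in for a dense layer.
layer = PHMLinear(n=4, in_features=64, out_features=64)
out = layer(torch.randn(8, 64))  # -> shape (8, 64)
```

The roughly 1/n parameter count of such a layer is consistent with the memory savings reported in the abstract when these transformations replace dense layers inside a GNN.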
Key words
Graph neural networks, Graph representation learning, Graph classification