Handling Over-Smoothing and Over-Squashing in Graph Convolution with Maximization Operation

IEEE Transactions on Neural Networks and Learning Systems (2024)

Abstract
Recent years have witnessed the great success of graph convolutional networks (GCNs) in various scenarios. However, due to the challenging over-smoothing and over-squashing problems, the ability of GCNs to model information from long-distance nodes has been largely limited. One solution is to aggregate features from different hops of neighborhoods via a linear combination followed by a shallow feature transformation. However, we demonstrate that such methods can only achieve a tradeoff between the two problems. To this end, in this article, we design a simple yet effective graph convolution (GC), named maximization-based GC (MGC). Instead of using a linear combination, MGC applies an elementwise maximization operation over all powers of the normalized adjacency matrix to construct a GC operation. As evidenced by theoretical and empirical analysis, MGC can effectively handle both problems. In addition, an efficient approximate model with linear complexity is developed to extend MGC to large-scale graph learning. To demonstrate the effectiveness, scalability, and efficiency of our models, extensive experiments have been conducted on various benchmark datasets. In particular, our models achieve competitive performance with lower complexity, even on large graphs with more than 100M nodes. Our code is available at https://github.com/SmilesDZgk/MGC.
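The core mechanism described above — replacing the linear combination of multi-hop features with an elementwise maximum over propagated features — can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the authors' implementation: the function name `mgc_propagate`, the use of self-loops, and the symmetric normalization are standard GCN conventions adopted here for concreteness.

```python
import numpy as np

def mgc_propagate(adj, x, num_hops=4):
    """Hypothetical sketch of maximization-based propagation:
    take the elementwise max over features propagated across
    0..num_hops hops, instead of a linear combination of them."""
    # Symmetrically normalize the adjacency matrix with self-loops,
    # as in a standard GCN: D^{-1/2} (A + I) D^{-1/2}.
    a = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    a_norm = d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]

    h = x.copy()
    out = x.copy()  # hop 0: the raw features
    for _ in range(num_hops):
        h = a_norm @ h            # propagate one more hop
        out = np.maximum(out, h)  # elementwise max across hops
    return out

# Toy example: a 3-node path graph with 2-d node features
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x = np.array([[1., 0.],
              [0., 1.],
              [0., 0.]])
out = mgc_propagate(adj, x, num_hops=2)
```

Because the output is a running maximum that includes the hop-0 features, each entry of `out` is at least as large as the corresponding entry of `x`; repeated propagation cannot wash a node's own signal out toward the graph average, which is the intuition behind using a maximum rather than a weighted sum.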
Key words
Graph convolutional network (GCN), node classification, over-smoothing, over-squashing