Training Sparse Graph Neural Networks Via Pruning and Sprouting

Proceedings of the 2024 SIAM International Conference on Data Mining (SDM 2024)

Abstract
With the emergence of large-scale graphs and deeper graph neural networks (GNNs), sparsifying GNNs, both their graph connections and their model parameters, has attracted considerable attention. However, most existing GNN sparsification methods apply traditional neural network pruning techniques to graphs in an iterative train-then-sparsify cycle, which not only incurs high training costs but also limits model performance. In this paper, we propose a novel Pruning and Sprouting framework for GNNs (PSGNN) that not only improves inference efficiency but also enables a GNN trained on a core subgraph to outperform one trained on the original graph. Building on during-training pruning, our framework gradually sparsifies graph connections and model weights simultaneously. More specifically, PSGNN removes edges from the original graph according to the predicted label similarity between nodes, computed from a global view. In addition, with our graph sprouting strategy, PSGNN can generate new edges to capture important topological and feature information missing from the original graph, while preserving graph sparsity. Extensive experiments on the node classification task across different GNN architectures and graph datasets demonstrate that PSGNN improves performance over existing methods while reducing training and inference costs.
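To make the two operations in the abstract concrete, the following is a minimal PyTorch sketch of one pruning-and-sprouting step. It is an illustration under stated assumptions, not the authors' implementation: the function name `prune_and_sprout`, the dot-product similarity between predicted class distributions, and the `prune_ratio` parameter are all hypothetical choices, since the abstract does not specify these details.

```python
import torch
import torch.nn.functional as F

def prune_and_sprout(edge_index, logits, prune_ratio=0.1):
    """One illustrative pruning-and-sprouting step (assumed sketch, not the paper's code).

    edge_index: LongTensor of shape [2, E] listing directed edges.
    logits:     FloatTensor of shape [N, C] with the model's current per-node predictions.
    """
    probs = F.softmax(logits, dim=1)          # predicted label distributions, shape [N, C]
    src, dst = edge_index

    # --- Pruning: drop the edges whose endpoints disagree most in predicted labels. ---
    edge_sim = (probs[src] * probs[dst]).sum(dim=1)            # similarity per existing edge
    num_prune = int(prune_ratio * edge_index.size(1))
    keep = edge_sim.argsort(descending=True)[: edge_index.size(1) - num_prune]
    kept_edges = edge_index[:, keep]

    # --- Sprouting: add the same number of new edges between highly similar
    # but currently unconnected node pairs, so overall sparsity is preserved. ---
    sim = probs @ probs.t()                                    # dense [N, N]; fine only for small graphs
    lower = torch.tril(torch.ones_like(sim, dtype=torch.bool)) # mask diagonal and duplicate pairs
    sim[lower] = -1.0
    sim[src, dst] = -1.0                                       # mask existing edges (both directions)
    sim[dst, src] = -1.0
    idx = sim.flatten().topk(num_prune).indices
    new_src = torch.div(idx, sim.size(1), rounding_mode="floor")
    new_dst = idx % sim.size(1)
    sprouted = torch.stack([new_src, new_dst])

    return torch.cat([kept_edges, sprouted], dim=1)
```

On an undirected graph, each sprouted edge would also be added in the reverse direction; a practical implementation would likewise replace the dense N-by-N similarity with a scalable approximation, and the paper's global-view edge scoring may differ from this sketch.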
Keywords
Graph neural networks, sparse training