SpanGNN: Towards Memory-Efficient Graph Neural Networks Via Spanning Subgraph Training

Xizhi Gu, Hongzheng Li, Shihong Gao, Xinyan Zhang, Lei Chen, Yingxia Shao

Lecture Notes in Computer Science, Machine Learning and Knowledge Discovery in Databases: Research Track (2024)

Abstract
Graph Neural Networks (GNNs) have superior capability in learning graph data. Full-graph GNN training generally achieves high accuracy; however, it suffers from large peak memory usage and encounters the out-of-memory problem when handling large graphs. To address this memory problem, a popular solution is mini-batch GNN training. However, mini-batch GNN training increases the training variance and sacrifices model accuracy. In this paper, we propose a new memory-efficient GNN training method using spanning subgraphs, called SpanGNN. SpanGNN trains GNN models over a sequence of spanning subgraphs, which are constructed from an empty structure. To overcome the problem of excessive peak memory consumption, SpanGNN selects a set of edges from the original graph to incrementally update the spanning subgraph between every epoch. To ensure model accuracy, we introduce two types of edge sampling strategies (i.e., variance-reduced and noise-reduced) that help SpanGNN select high-quality edges for GNN learning. We conduct experiments with SpanGNN on widely used datasets, demonstrating SpanGNN's advantages in model performance and low peak memory usage.
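The abstract describes an incremental training loop: start from an empty edge set and, each epoch, add a sampled batch of edges from the original graph before training on the current spanning subgraph. Below is a minimal PyTorch sketch of that loop under stated assumptions: the growth budget `EDGES_PER_EPOCH`, the helper `gcn_layer`, and the uniform edge sampler are all illustrative stand-ins; the paper's actual variance-reduced and noise-reduced samplers are not specified in the abstract.

```python
# Minimal sketch of spanning-subgraph GNN training, assuming a uniform edge
# sampler in place of the paper's quality-aware (variance-/noise-reduced)
# edge selection. All names and constants here are hypothetical.
import torch
import torch.nn.functional as F

def gcn_layer(x, edge_index, weight, num_nodes):
    """One mean-aggregation GNN layer over the current spanning subgraph."""
    src, dst = edge_index
    agg = torch.zeros(num_nodes, x.size(1))
    agg.index_add_(0, dst, x[src])                 # sum neighbor features
    deg = torch.zeros(num_nodes).index_add_(0, dst, torch.ones(src.size(0)))
    agg = agg / deg.clamp(min=1).unsqueeze(1)      # mean over in-neighbors
    return (x + agg) @ weight                      # self + aggregated features

# Toy graph: N nodes, full edge set of the original graph.
N, F_IN, NUM_CLASSES = 100, 16, 4
x = torch.randn(N, F_IN)
y = torch.randint(0, NUM_CLASSES, (N,))
full_edges = torch.randint(0, N, (2, 2000))

W = torch.randn(F_IN, NUM_CLASSES, requires_grad=True)
opt = torch.optim.Adam([W], lr=0.01)

EDGES_PER_EPOCH = 200                              # hypothetical growth budget
remaining = torch.randperm(full_edges.size(1))     # edges not yet selected
span_edges = torch.empty(2, 0, dtype=torch.long)   # start from an empty structure

for epoch in range(10):
    # Incrementally grow the spanning subgraph between epochs.
    take, remaining = remaining[:EDGES_PER_EPOCH], remaining[EDGES_PER_EPOCH:]
    span_edges = torch.cat([span_edges, full_edges[:, take]], dim=1)

    # Train on the current spanning subgraph only; peak memory scales with
    # |span_edges| rather than with the full edge set.
    opt.zero_grad()
    out = gcn_layer(x, span_edges, W, N)
    loss = F.cross_entropy(out, y)
    loss.backward()
    opt.step()
```

In this sketch the memory saving comes from message passing touching only the edges selected so far; how aggressively the subgraph grows, and which edges are prioritized, are exactly the design choices the paper's two sampling strategies address.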