Generating Real-Time Explanations for GNNs via Multiple Specialty Learners and Online Knowledge Distillation

Tien-Cuong Bui, Van-Duc Le, Wen-Syan Li

IEEE Access (2023)

Abstract
Graph Neural Networks (GNNs) have become increasingly prevalent in numerous applications, necessitating explanations of their predictions. However, explaining GNNs is challenging due to the complexity of graph data and model execution. Post-hoc explanation approaches have gained popularity thanks to their versatility, despite their additional computational costs. Although intrinsically interpretable models can provide instant explanations, they are usually model-specific and can explain only particular GNNs. To address these challenges, we propose SCALE, a novel, general, and fast GNN explanation framework. Because building a single powerful explainer that examines the attributions of interactions in input graphs is complicated, SCALE trains multiple specialty learners to explain GNNs. During training, a black-box GNN model guides the learners following an online knowledge distillation paradigm. During the explanation phase, explanations of predictions are generated by multiple explainers corresponding to the trained learners. Edge masking and random-walk-with-restart procedures provide structural explanations for graph-level and node-level predictions, and a feature attribution module provides overall summaries and instance-level feature contributions. We compare SCALE with state-of-the-art baselines through extensive experiments to demonstrate its explanation correctness and execution performance. Furthermore, we conduct a user study and a series of ablation studies to understand its strengths and weaknesses.
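The abstract names random walk with restart (RWR) as one of the structural-explanation procedures without detailing it. As a rough illustration only, below is a minimal NumPy sketch of RWR scoring nodes around a query node; the function name `rwr_scores`, the restart probability of 0.15, and the dense-matrix representation are assumptions for this sketch, not the paper's implementation.

```python
import numpy as np

def rwr_scores(adj: np.ndarray, seed: int, restart: float = 0.15,
               tol: float = 1e-6, max_iter: int = 100) -> np.ndarray:
    """Score nodes by their stationary visiting probability in a
    random walk that restarts at `seed` with probability `restart`."""
    n = adj.shape[0]
    deg = adj.sum(axis=1, keepdims=True)
    # Row-normalize the adjacency matrix into a transition matrix;
    # isolated nodes keep an all-zero row.
    trans = np.divide(adj.astype(float), deg,
                      out=np.zeros((n, n)), where=deg > 0)
    restart_vec = np.zeros(n)
    restart_vec[seed] = 1.0
    p = restart_vec.copy()
    for _ in range(max_iter):
        # One power-iteration step: follow an edge with probability
        # (1 - restart), jump back to the seed with probability `restart`.
        p_next = (1.0 - restart) * trans.T @ p + restart * restart_vec
        if np.abs(p_next - p).sum() < tol:
            p = p_next
            break
        p = p_next
    return p

# Toy example: a 4-node path graph 0-1-2-3, scoring nodes around node 0.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]])
print(rwr_scores(adj, seed=0))  # nodes nearer the seed score higher
```

In an explanation setting, such scores could rank a query node's neighbors by relevance; how SCALE combines this with edge masking and the trained learners is specified in the paper itself.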
Keywords
online knowledge distillation, GNNs, multiple specialty learners, real-time