A New Similarity-Based Relational Knowledge Distillation Method

ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2024

Abstract
Previous relation-based knowledge distillation methods tend to construct a global similarity relationship matrix within a mini-batch while ignoring neighbourhood relationship knowledge. In this paper, we propose a new similarity-based relational knowledge distillation method that transfers neighbourhood relationship knowledge by selecting the K nearest neighbours of each sample. Our method consists of two components: Neighbourhood Feature Relationship Distillation and Neighbourhood Logits Relationship Distillation. Extensive experiments on the CIFAR-100 and Tiny ImageNet classification datasets show that our method outperforms state-of-the-art knowledge distillation methods. Our code is available at: https://github.com/xinxiaoxiaomeng/NRKD.git.
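The core idea in the abstract can be illustrated with a short sketch: instead of matching the teacher's and student's full in-batch similarity matrices, the loss is restricted to each sample's K nearest neighbours. The snippet below is an illustrative NumPy reconstruction under assumed design choices (cosine similarity, neighbours chosen from the teacher's similarity matrix, mean-squared-error matching); it is not the authors' exact NRKD formulation, whose details are in the linked repository.

```python
import numpy as np

def knn_similarity_loss(teacher_feats, student_feats, k=3):
    """Hedged sketch of neighbourhood relationship distillation.

    Neighbour indices are taken from the teacher's cosine-similarity
    matrix; the student is then penalised (MSE) only on the similarity
    entries corresponding to each sample's K nearest neighbours.
    Assumed choices: cosine similarity, teacher-defined neighbours, MSE.
    """
    def cosine_sim(x):
        # Row-normalise, then take the Gram matrix of pairwise cosines.
        x = x / np.linalg.norm(x, axis=1, keepdims=True)
        return x @ x.T

    t_sim = cosine_sim(np.asarray(teacher_feats, dtype=float))
    s_sim = cosine_sim(np.asarray(student_feats, dtype=float))
    n = t_sim.shape[0]

    # Exclude each sample itself, then pick its K most similar
    # neighbours according to the teacher.
    masked = t_sim.copy()
    np.fill_diagonal(masked, -np.inf)
    nbr = np.argsort(-masked, axis=1)[:, :k]

    rows = np.arange(n)[:, None]
    diff = t_sim[rows, nbr] - s_sim[rows, nbr]
    return float(np.mean(diff ** 2))
```

In this sketch the same recipe could be applied twice, once to penultimate-layer features and once to logits, mirroring the two components (Neighbourhood Feature Relationship Distillation and Neighbourhood Logits Relationship Distillation) named in the abstract.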
Keywords
Knowledge Distillation, Similarity Relationship Knowledge