Exploring effective knowledge distillation for tiny object detection

2023 IEEE International Conference on Image Processing (ICIP)

Abstract
Detecting tiny objects is a long-standing and critical problem in object detection, with broad real-world applications such as autonomous driving, surveillance, and medical diagnosis. Recent approaches to tiny object detection often incur extra computational cost at inference time because they introduce higher-resolution feature maps or additional network modules. This sacrifices inference speed for better detection accuracy and may heavily limit their applicability in real-world settings. Therefore, this paper turns to knowledge distillation to improve the representation learning of a small model, aiming for both superior detection accuracy and fast inference speed. Masked scale-aware feature distillation and local attention distillation are proposed to address the critical issues in distilling knowledge for tiny objects. Experimental results on two tiny object detection benchmarks indicate that our method brings noticeable performance gains to different detectors while keeping their original inference speeds. Our method also shows competitive performance compared to state-of-the-art methods for tiny object detection. Our code is available at https://github.com/haotianll/TinyKD.
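The abstract does not spell out how the masked scale-aware feature distillation is computed, so the following is only a minimal sketch of a masked feature-distillation loss in PyTorch. It assumes distillation is a mean-squared error between teacher and student feature maps restricted to a foreground mask; the function name, mask construction, and normalization are illustrative assumptions, not the paper's actual formulation (see the linked repository for that).

```python
# Minimal sketch of a masked feature-distillation loss in PyTorch.
# All names and the masking scheme are illustrative assumptions; the
# paper's actual method lives at https://github.com/haotianll/TinyKD.
import torch


def masked_feature_distillation_loss(student_feat, teacher_feat, mask):
    """MSE between student and teacher features, restricted to masked regions.

    student_feat, teacher_feat: (N, C, H, W) feature maps at the same scale.
    mask: (N, 1, H, W) binary mask, e.g. 1 inside boxes of tiny objects.
    """
    diff = (student_feat - teacher_feat) ** 2
    masked = diff * mask  # broadcasts the mask across channels
    # Normalize by the number of active mask positions so the loss scale
    # does not depend on how much of the image is foreground.
    return masked.sum() / mask.sum().clamp(min=1.0)


if __name__ == "__main__":
    student = torch.randn(2, 256, 32, 32, requires_grad=True)
    teacher = torch.randn(2, 256, 32, 32)
    fg_mask = (torch.rand(2, 1, 32, 32) > 0.7).float()
    loss = masked_feature_distillation_loss(student, teacher, fg_mask)
    loss.backward()
    print(loss.item())
```

If the masking works roughly this way, its purpose would be to keep the loss focused on the small foreground regions that matter for tiny objects, rather than letting the much larger background area dominate the distillation signal.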
Keywords
tiny object detection, object detection, knowledge distillation, representation learning