Knowledge Distillation for Road Detection based on cross-model Semi-Supervised Learning
CoRR (2024)
Abstract
The advancement of knowledge distillation has played a crucial role in
enabling the transfer of knowledge from larger teacher models to smaller and
more efficient student models, and is particularly beneficial for online and
resource-constrained applications. The effectiveness of the student model
heavily relies on the quality of the distilled knowledge received from the
teacher. Given the accessibility of unlabelled remote sensing data,
semi-supervised learning has become a prevalent strategy for enhancing model
performance. However, relying solely on semi-supervised learning with smaller
models may be insufficient due to their limited capacity for feature
extraction. This limitation restricts their ability to exploit training data.
To address this issue, we propose an integrated approach that combines
knowledge distillation and semi-supervised learning methods. This hybrid
approach leverages the robust capabilities of large models to effectively
utilise large unlabelled data whilst subsequently providing the small student
model with rich and informative features for enhancement. The proposed
semi-supervised learning-based knowledge distillation (SSLKD) approach
demonstrates a notable improvement in the performance of the student model, in
the application of road segmentation, surpassing the effectiveness of
traditional semi-supervised learning methods.
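The abstract does not give the exact loss, but the core idea of combining knowledge distillation with semi-supervised learning can be sketched as a training objective with two terms: a supervised cross-entropy on the labelled data, and a distillation term that pulls the student's soft predictions on unlabelled data towards those of the semi-supervised teacher. The function name `sslkd_loss` and the weighting scheme below are illustrative assumptions, not the authors' published formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def sslkd_loss(student_logits_lab, labels,
               student_logits_unlab, teacher_logits_unlab,
               T=2.0, alpha=0.5):
    """Hypothetical combined objective: supervised CE on labelled data
    plus KL distillation to the teacher on unlabelled data."""
    # Supervised cross-entropy on labelled samples (e.g. road/non-road pixels).
    p = softmax(student_logits_lab)
    idx = np.arange(len(labels))
    ce = -np.mean(np.log(p[idx, labels] + 1e-12))
    # Distillation: KL(teacher || student) on unlabelled data at temperature T.
    pt = softmax(teacher_logits_unlab, T)
    ps = softmax(student_logits_unlab, T)
    kl = np.mean(np.sum(pt * (np.log(pt + 1e-12) - np.log(ps + 1e-12)),
                        axis=-1)) * T * T
    return alpha * ce + (1.0 - alpha) * kl
```

When the student already matches the teacher on the unlabelled data and classifies the labelled data confidently, both terms approach zero; in practice the distillation term lets the large teacher's richer features, learned from abundant unlabelled remote sensing imagery, guide the capacity-limited student.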