Multi-Branch Feature Learning Network via Global-Local Self-Distillation for Vehicle Re-Identification

IEEE Transactions on Vehicular Technology (2024)

Abstract
Vehicle re-identification (Re-ID), which aims to match vehicles across multiple cameras with non-overlapping views, has wide applications in cooperative vehicle infrastructure systems (CVIS), e.g., determining the identity of vehicle images captured by cameras installed on intelligent vehicles (IVs) and on roadside units (RSUs). A crucial challenge is to extract abundant and discriminative features from vehicle images without labor-intensive and time-consuming manual annotation. In light of this, we propose a novel global-local self-distillation-based multi-branch network capable of learning effective feature representations. Specifically, we first present a partially shared backbone network that allows greater flexibility for multi-branch feature learning. Then, we apply channel partition to generate multiple local feature maps complementary to the global feature map. More importantly, we devise a novel global-local self-distillation method that transfers knowledge from the global feature map to the local feature maps as supervision signals. Finally, we train the network by jointly optimizing the loss functions for the vehicle Re-ID and self-distillation tasks. Extensive experiments on two public vehicle Re-ID datasets demonstrate that our model achieves 83.5% mAP on VeRi-776 and 91.8% mAP on VehicleID without the assistance of extra detection or segmentation networks.
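The two core operations described above, channel partition and global-to-local self-distillation, can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the paper's implementation: the number of branches, the logit-level KL formulation, and the temperature value are assumptions chosen to show the general self-distillation pattern.

```python
import torch
import torch.nn.functional as F

def channel_partition(feat: torch.Tensor, num_parts: int):
    """Split a backbone feature map (B, C, H, W) along the channel axis
    into num_parts local feature maps of shape (B, C // num_parts, H, W)."""
    return torch.chunk(feat, num_parts, dim=1)

def self_distill_loss(global_logits: torch.Tensor,
                      local_logits_list: list[torch.Tensor],
                      T: float = 4.0) -> torch.Tensor:
    """Hypothetical global-local self-distillation at the logit level:
    the global branch's softened predictions (detached, so no gradient
    flows into the teacher) supervise each local branch via KL divergence."""
    teacher = F.softmax(global_logits.detach() / T, dim=1)
    loss = global_logits.new_zeros(())
    for logits in local_logits_list:
        student = F.log_softmax(logits / T, dim=1)
        # T^2 rescaling keeps gradient magnitudes comparable across temperatures
        loss = loss + F.kl_div(student, teacher, reduction="batchmean") * (T * T)
    return loss / len(local_logits_list)
```

In training, this distillation term would be added to the usual Re-ID losses (e.g., identity classification and metric losses) and optimized jointly, matching the joint-optimization scheme the abstract describes.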
Keywords
Vehicle re-identification, attention mechanism, knowledge distillation, multi-branch feature learning