FedBKD: Heterogenous Federated Learning via Bidirectional Knowledge Distillation for Modulation Classification in IoT-Edge System

IEEE Journal of Selected Topics in Signal Processing (2023)

Abstract
Benefiting from the rapid evolution of artificial intelligence and wireless communication technology, diverse Internet of Things (IoT) devices with edge computing capability have penetrated every aspect of daily life. However, the deviations among private datasets and the heterogeneity of local models, caused by differences in device composition and application scenarios, hamper the aggregation of a global recognition model for the modulation classification task, severely constraining the classification performance of intelligent IoT-edge devices. To address this problem, we propose a heterogeneous federated learning framework based on Bidirectional Knowledge Distillation (FedBKD) for IoT systems, which integrates knowledge distillation into both the local model upload (client-to-cloud) and the global model download (cloud-to-client) steps of federated learning. The client-to-cloud distillation is treated as multi-teacher knowledge distillation, where the global network acts as a student that unifies the heterogeneous knowledge of multiple local teacher networks. A public dataset generated by a conditional variational autoencoder (CVAE) is stored on the cloud server, enabling the extraction of heterogeneous knowledge without sharing the private data of IoT devices. The cloud-to-client distillation is a single-teacher, multiple-student process that distills knowledge from the single global model back to the multiple heterogeneous local networks, using partial knowledge distillation. We apply FedBKD to the modulation classification task, and simulation results demonstrate the effectiveness of the proposed method.
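The abstract describes a two-way distillation loop: a multi-teacher client-to-cloud step and a single-teacher cloud-to-client step, both driven by soft predictions on a CVAE-generated public dataset held by the cloud. The sketch below shows one way these two steps could be wired up in PyTorch, assuming standard soft-label KL distillation. All names and hyper-parameters (kd_loss, temperature T, public_loader, the Adam optimizer) are illustrative assumptions, and the paper's partial knowledge distillation in the cloud-to-client step is not reproduced here.

```python
# Minimal sketch of the bidirectional distillation loop (not the authors' code).
import torch
import torch.nn.functional as F

T = 2.0  # assumed softening temperature

def kd_loss(student_logits, teacher_logits, temperature=T):
    """Soft-label KL distillation loss."""
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

def client_to_cloud(global_net, local_nets, public_loader, epochs=1, lr=1e-3):
    """Multi-teacher step: the global (student) network absorbs the averaged
    soft predictions of the heterogeneous local (teacher) networks on the
    CVAE-generated public dataset."""
    opt = torch.optim.Adam(global_net.parameters(), lr=lr)
    for _ in range(epochs):
        for x, _ in public_loader:  # labels unused: soft targets only
            with torch.no_grad():
                teacher_logits = torch.stack([t(x) for t in local_nets]).mean(0)
            loss = kd_loss(global_net(x), teacher_logits)
            opt.zero_grad(); loss.backward(); opt.step()
    return global_net

def cloud_to_client(global_net, local_net, public_loader, epochs=1, lr=1e-3):
    """Single-teacher step: knowledge of the aggregated global model is
    distilled back into one heterogeneous local network."""
    opt = torch.optim.Adam(local_net.parameters(), lr=lr)
    for _ in range(epochs):
        for x, _ in public_loader:
            with torch.no_grad():
                teacher_logits = global_net(x)
            loss = kd_loss(local_net(x), teacher_logits)
            opt.zero_grad(); loss.backward(); opt.step()
    return local_net
```

Because only logits on the shared public data are exchanged, the local networks may have arbitrary architectures and private datasets never leave the devices, which is the point of distillation-based aggregation in this setting.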
Keywords
IoT, federated learning, model heterogeneity, knowledge distillation, conditional variational autoencoder