HFedMS: Heterogeneous Federated Learning With Memorable Data Semantics in Industrial Metaverse

arXiv (2023)

Abstract
Federated Learning (FL), a rapidly evolving privacy-preserving collaborative machine learning paradigm, is a promising approach to enabling edge intelligence in the emerging Industrial Metaverse. Although many successful use cases have demonstrated the feasibility of FL in theory, in industrial Metaverse practice the problems of non-independent and identically distributed (non-i.i.d.) data, learning forgetting caused by streaming industrial data, and scarce communication bandwidth remain key barriers to practical FL. Facing these three challenges simultaneously, this article presents HFedMS, a high-performance and efficient system for incorporating practical FL into the Industrial Metaverse. HFedMS reduces data heterogeneity through dynamic grouping and training-mode conversion (Dynamic Sequential-to-Parallel Training, STP). It then compensates for forgotten knowledge by fusing compressed historical data semantics and calibrating classifier parameters (Semantic Compression and Compensation, SCC). Finally, the network parameters of the feature extractor and the classifier are synchronized at different frequencies (Layer-wise Alternative Synchronization Protocol, LASP) to reduce communication costs. These techniques make FL more adaptable to the heterogeneous streaming data continuously generated by industrial equipment and more communication-efficient than traditional methods (e.g., Federated Averaging). Extensive experiments were conducted on the streamed non-i.i.d. FEMNIST dataset with 368 simulated devices. Numerical results show that HFedMS improves classification accuracy by at least 6.4% over 8 benchmarks and reduces both overall runtime and transferred bytes by up to 98%, demonstrating its superiority in precision and efficiency.
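To illustrate the layer-wise synchronization idea behind LASP, the following is a minimal sketch (not the authors' implementation) of a server-side aggregation step in which the small classifier head is synchronized every round while the larger feature extractor is synchronized only every few rounds. The parameter-name split (`extractor.*` / `classifier.*`), the period `extractor_period`, and the function names are assumptions made for illustration only.

```python
# Hypothetical sketch of layer-wise alternative synchronization; not the paper's code.
from typing import Dict, List
import torch


def average(states: List[Dict[str, torch.Tensor]], keys: List[str]) -> Dict[str, torch.Tensor]:
    """Element-wise average of the selected parameters across client updates."""
    return {k: torch.stack([s[k] for s in states]).mean(dim=0) for k in keys}


def lasp_round(global_state: Dict[str, torch.Tensor],
               client_states: List[Dict[str, torch.Tensor]],
               round_idx: int,
               extractor_period: int = 5) -> Dict[str, torch.Tensor]:
    """One aggregation round: classifier layers are synchronized every round,
    feature-extractor layers only every `extractor_period` rounds."""
    clf_keys = [k for k in global_state if k.startswith("classifier.")]
    ext_keys = [k for k in global_state if k.startswith("extractor.")]

    new_state = dict(global_state)
    new_state.update(average(client_states, clf_keys))      # frequent, cheap sync
    if round_idx % extractor_period == 0:                    # infrequent, costly sync
        new_state.update(average(client_states, ext_keys))
    return new_state
```

Under these assumptions, most rounds transfer only the classifier parameters, which is consistent with the abstract's claim that synchronizing layers at different frequencies reduces communication costs relative to full-model Federated Averaging.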
Keywords
Metaverse, Sensors, Training, Synchronization, Data models, Semantics, Low-power wide area networks, federated learning, stream data, Non-IID, forgetting, communication efficient