Reschedule Gradients: Temporal Non-IID Resilient Federated Learning

IEEE Internet of Things Journal (2023)

Abstract
Federated learning is a popular framework for performing distributed machine learning while protecting client privacy. However, the heterogeneous data distributions found in real-world environments make it difficult for model training to converge. In this article, we propose federated gradient scheduling (FedGS), an improved method for utilizing historical gradients in federated-learning optimizers, designed to alleviate the instability of historical gradient information caused by non-IID data. FedGS improves federated learning performance in two main steps. First, it clusters clients by their label distributions, relabeling the clients and the gradients they submit. Second, it samples from these gradient clusters to generate an IID gradient set, which is fed to the optimizer to derive valid momentum information. In addition, we introduce differential privacy to work alongside FedGS and strengthen clients' privacy protection. Compared with previous non-IID federated learning solutions, our method achieves greater resistance to temporal non-IID. Moreover, experiments show that FedGS converges faster and gains up to 10% in performance over existing state-of-the-art methods in some scenarios. FedGS also combines easily with existing methods to achieve better performance. We further verify that our method yields robust performance gains across different non-IID scenarios, demonstrating the adaptability of FedGS.
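The two-step pipeline described in the abstract (cluster clients by label distribution, then sample gradients evenly across clusters) lends itself to a short illustration. The following is a minimal sketch, not the paper's implementation: the helper names (cluster_clients, sample_iid_gradient_set), the choice of k-means for clustering, the per_cluster sampling budget, and the plain averaging of the sampled gradients are all assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_clients(label_dists, n_clusters=5, seed=0):
    """Step 1 (sketch): group clients by their label distributions so
    that each cluster holds clients with similar non-IID data.

    label_dists: (n_clients, n_classes) array; row i is client i's
    normalized label histogram. Returns a cluster id per client.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    return km.fit_predict(label_dists)

def sample_iid_gradient_set(client_grads, cluster_ids, per_cluster=1, rng=None):
    """Step 2 (sketch): draw gradients evenly from every cluster so the
    sampled set approximates an IID mixture, then average it into a
    single gradient for the optimizer's momentum update.

    client_grads: list of flattened gradient vectors, one per client.
    Assumes each cluster has at least `per_cluster` members.
    """
    rng = rng or np.random.default_rng()
    picked = []
    for c in np.unique(cluster_ids):
        members = np.flatnonzero(cluster_ids == c)
        for i in rng.choice(members, size=per_cluster, replace=False):
            picked.append(client_grads[i])
    return np.mean(picked, axis=0)  # pseudo-IID aggregated gradient
```

Under these assumptions, the server would call cluster_clients once (or periodically, as label distributions change), then call sample_iid_gradient_set each round and feed the result to a momentum-based optimizer in place of a raw per-round average.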
Keywords
Data privacy, deep learning, differential privacy, federated learning, non-IID, optimizer