Addressing Heterogeneity in Federated Learning with Client Selection via Submodular Optimization (Just Accepted)

Jinghui Zhang, Jiawei Wang, Yaning Li, Fa Xin, Fang Dong, Junzhou Luo, Zhihua Wu

ACM Transactions on Sensor Networks (2023)

Abstract
Federated learning (FL) has been proposed as a privacy-preserving distributed learning paradigm. It differs from traditional distributed learning in two main aspects: systems heterogeneity, meaning that clients participating in training differ significantly in systems performance, including CPU frequency, dataset size, and transmission power; and statistical heterogeneity, meaning that the data distribution among clients is Non-Independent and Identically Distributed (Non-IID). As a result, random client selection significantly reduces the training efficiency of FL. In this paper, we propose a client selection mechanism that considers both systems and statistical heterogeneity, aiming to improve time-to-accuracy performance by trading off the impact of systems-performance differences and data-distribution differences among clients on training efficiency. First, client selection is formulated as a combinatorial optimization problem that jointly optimizes systems and statistical performance. We then generalize it to a submodular maximization problem with a knapsack constraint, and propose the Iterative Greedy with Partial Enumeration (IGPE) algorithm to greedily select suitable clients. The approximation ratio of IGPE is then analyzed theoretically. Extensive experiments verify that the time-to-accuracy performance of IGPE outperforms the compared algorithms in a variety of heterogeneous environments.
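To illustrate the kind of procedure the abstract describes, the following is a minimal sketch of cost-benefit greedy selection for submodular maximization under a knapsack (budget) constraint. It is not the paper's IGPE algorithm, which additionally uses partial enumeration to obtain its approximation guarantee; the clients, coverage-style utility, and costs below are hypothetical examples.

```python
def greedy_knapsack(clients, utility, cost, budget):
    """Greedily add the client with the best marginal-utility-to-cost ratio.

    utility(S) is assumed to be a monotone submodular set function;
    cost[c] is each client's cost; budget caps the total selected cost.
    """
    selected = set()
    remaining = set(clients)
    while remaining:
        spent = sum(cost[s] for s in selected)
        best, best_ratio = None, 0.0
        for c in sorted(remaining):  # sorted for deterministic tie-breaking
            if spent + cost[c] > budget:
                continue  # adding c would violate the knapsack constraint
            gain = utility(selected | {c}) - utility(selected)
            ratio = gain / cost[c]
            if ratio > best_ratio:
                best, best_ratio = c, ratio
        if best is None:
            break  # no feasible client with positive marginal gain
        selected.add(best)
        remaining.remove(best)
    return selected

# Hypothetical toy example: each client holds a set of data classes, and the
# (submodular) utility is how many distinct classes the selection covers.
data = {"c1": {0, 1}, "c2": {1, 2}, "c3": {3}, "c4": {0, 1, 2, 3}}
costs = {"c1": 1.0, "c2": 1.0, "c3": 1.0, "c4": 3.5}

def coverage(S):
    return len(set().union(*(data[c] for c in S))) if S else 0

chosen = greedy_knapsack(data.keys(), coverage, costs, budget=3.0)
# With budget 3.0, c4 alone is too costly, so the greedy picks c1, c2, c3
# and covers all four classes.
```

The cost-benefit ratio rule is the standard greedy heuristic for knapsack-constrained submodular maximization; partial enumeration over small seed sets, as in IGPE, is what lifts such greedy procedures to a constant-factor approximation guarantee.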
Keywords
Federated Learning, Client Selection, Systems Heterogeneity, Statistical Heterogeneity, Submodular Functions