Power of Redundancy: Surplus Client Scheduling for Federated Learning Against User Uncertainties

IEEE Transactions on Mobile Computing (2023)

Abstract
Federated learning (FL) has reshaped the learning paradigm by overcoming privacy concerns and siloed-data issues. In FL, an aggregator schedules a set of mobile users (MUs) to collectively train a global model on their local datasets and subsequently aggregates their model updates to obtain a new global model. However, the users are subject to many uncertainties, such as unstable network connections and volatile availability, which lead to the straggler problem and deteriorate the efficiency of the FL system. In addition, the issue of non-IID datasets hinders the convergence performance of the global model. To cope with user uncertainties, we associate a deadline with the scheduling decision in each round and only partially collect MUs' updates after the deadline, which is achieved by incorporating surplus budget constraints. Moreover, we introduce fairness constraints for the non-IID issue, ensuring that every MU has a chance to be scheduled in each round while MUs with large and diverse local datasets are preferentially selected. We propose a deadline-aware task-replication policy for surplus client scheduling, called FEDDATE-CS. FEDDATE-CS is developed on a novel contextual-combinatorial multi-armed bandit (CCMAB) learning framework with a fairness guarantee. We extend the hypercube-based CCMAB framework by integrating the Lyapunov queuing technique and rigorously prove that FEDDATE-CS achieves a sublinear regret bound when learning the optimal client scheduling solution under uncertainties. Moreover, FEDDATE-CS provides an [O(1/V), O(V)] regret-fairness tradeoff for any fairness control factor V > 0. We conduct extensive evaluations to demonstrate the significant advantages of FEDDATE-CS over benchmarks.
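The two mechanisms the abstract describes, drift-plus-penalty client selection (a V-weighted combination of a learned reward estimate and a fairness queue backlog) and deadline-based partial aggregation, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual FEDDATE-CS algorithm: the function names, the scalar "UCB score" stand-in for the contextual-combinatorial bandit estimate, and the list-based model updates are all illustrative assumptions.

```python
def select_clients(ucb_scores, fairness_queues, V, budget):
    """Drift-plus-penalty selection (hypothetical sketch).

    V weights the learned per-client reward estimate against the
    fairness queue backlog: large V favors reward (lower regret),
    small V favors long-skipped clients (stronger fairness).
    """
    scores = {c: V * ucb_scores[c] + fairness_queues[c] for c in ucb_scores}
    return set(sorted(scores, key=scores.get, reverse=True)[:budget])


def partial_aggregate(updates, finish_times, deadline):
    """Average only the updates that arrive before the round deadline,
    discarding stragglers (the surplus-scheduled clients compensate)."""
    arrived = [u for u, t in zip(updates, finish_times) if t <= deadline]
    if not arrived:
        return None  # no client met the deadline this round
    dim = len(arrived[0])
    return [sum(u[i] for u in arrived) / len(arrived) for i in range(dim)]


# Example: client "c" has a large fairness backlog, so it is chosen
# over the higher-reward "b"; then only the two on-time updates
# (finish times 0.5 and 0.8 vs. deadline 1.0) are averaged.
chosen = select_clients(
    ucb_scores={"a": 0.9, "b": 0.5, "c": 0.4},
    fairness_queues={"a": 0.0, "b": 0.0, "c": 2.0},
    V=1.0,
    budget=2,
)
model = partial_aggregate(
    updates=[[1.0, 1.0], [3.0, 3.0], [5.0, 5.0]],
    finish_times=[0.5, 2.0, 0.8],
    deadline=1.0,
)
```

In a full system the fairness queues would be updated each round (growing for skipped clients, draining for served ones), which is the Lyapunov virtual-queue mechanism behind the [O(1/V), O(V)] tradeoff.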
Keywords
Federated learning, client scheduling, task replication, bandits