ESFL: Efficient Split Federated Learning over Resource-Constrained Heterogeneous Wireless Devices
CoRR (2024)
Abstract
Federated learning (FL) allows multiple parties (distributed devices) to
train a machine learning model without sharing raw data. How to effectively and
efficiently utilize the resources on devices and the central server is a highly
interesting yet challenging problem. In this paper, we propose an efficient
split federated learning algorithm (ESFL) to take full advantage of the
powerful computing capabilities at a central server under a split federated
learning framework with heterogeneous end devices (EDs). By splitting the model
into different submodels between the server and EDs, our approach jointly
optimizes user-side workload and server-side computing resource allocation by
considering users' heterogeneity. We formulate the whole optimization problem
as a mixed-integer non-linear program, which is an NP-hard problem, and develop
an iterative approach to obtain an approximate solution efficiently. Extensive
simulations have been conducted to validate the significantly increased
efficiency of our ESFL approach compared with standard federated learning,
split learning, and splitfed learning.
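To make the splitting idea concrete, here is a minimal, hypothetical sketch (not the paper's actual ESFL algorithm, which jointly optimizes workload and server resource allocation via an iterative MINLP solution): given assumed per-layer compute costs and the compute rates of a device and the server, it picks the cut layer that balances device-side and server-side computation for a single heterogeneous device. All names (`split_point`, `layer_flops`) and the balancing criterion are illustrative assumptions.

```python
def split_point(device_flops_per_s, server_flops_per_s, layer_flops):
    """Choose a model cut index so that the slower side of the split
    (device computing layers [:cut], server computing layers [cut:])
    finishes as early as possible. Illustrative heuristic only."""
    best_cut, best_time = 0, float("inf")
    for cut in range(len(layer_flops) + 1):
        t_dev = sum(layer_flops[:cut]) / device_flops_per_s
        t_srv = sum(layer_flops[cut:]) / server_flops_per_s
        t = max(t_dev, t_srv)  # pipeline bottleneck for this cut
        if t < best_time:
            best_cut, best_time = cut, t
    return best_cut, best_time

# Equal device and server speeds: the cut lands mid-model.
print(split_point(1.0, 1.0, [1, 1, 1, 1]))  # → (2, 2.0)
# A weaker device pushes more layers to the server.
print(split_point(1.0, 3.0, [2, 2, 2]))
```

With heterogeneous devices, each device would get its own cut, which is the per-user workload assignment the abstract refers to; the full problem additionally allocates the server's shared resources across users.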
Keywords
Distributed machine learning, federated learning, split learning, wireless networking