Convergence Analysis of Sequential Federated Learning on Heterogeneous Data
arXiv (2023)
Abstract
There are two categories of methods in Federated Learning (FL) for joint
training across multiple clients: i) parallel FL (PFL), where clients train
models in a parallel manner; and ii) sequential FL (SFL), where clients train
models in a sequential manner. In contrast to PFL, the convergence theory of
SFL on heterogeneous data is still lacking. In this paper, we
establish the convergence guarantees of SFL for strongly/general/non-convex
objectives on heterogeneous data. The convergence guarantees of SFL are better
than those of PFL on heterogeneous data with both full and partial client
participation. Experimental results validate the counterintuitive finding
that SFL outperforms PFL on extremely heterogeneous data in cross-device
settings.
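The distinction between the two paradigms can be illustrated with a minimal sketch of a single communication round. This is a hypothetical toy example (scalar models, a quadratic loss per client), not the paper's actual algorithms: in PFL all clients start from the same global model and their updates are averaged (FedAvg-style), while in SFL the model is passed from client to client, each training on top of the previous client's result.

```python
import numpy as np

def local_sgd(model, target, lr=0.1, steps=5):
    """A few local SGD steps on a toy quadratic loss (model - target)^2 / 2."""
    for _ in range(steps):
        grad = model - target          # gradient of the toy local loss
        model = model - lr * grad
    return model

def pfl_round(global_model, client_targets):
    """Parallel FL: every client trains from the same global model
    in parallel; the server averages the resulting local models."""
    local_models = [local_sgd(global_model, t) for t in client_targets]
    return float(np.mean(local_models))

def sfl_round(global_model, client_targets):
    """Sequential FL: one model travels through the clients in order,
    so each client continues from its predecessor's output."""
    model = global_model
    for t in client_targets:
        model = local_sgd(model, t)
    return model

# Two heterogeneous clients whose local optima (0 and 10) disagree.
clients = [0.0, 10.0]
print(pfl_round(5.0, clients))  # averaging cancels the symmetric pulls
print(sfl_round(5.0, clients))  # sequential pass is biased toward the last client
```

The sketch also hints at why client order and data heterogeneity matter for SFL: the sequentially trained model drifts toward the most recently visited client, whereas parallel averaging balances the clients within each round.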