Slingshot: Globally Favorable Local Updates for Federated Learning

Jialiang Liu, Huawei Huang, Chun Wang, Sicong Zhou, Ruixin Li, Zibin Zheng

IEEE Open Journal of the Computer Society (2024)

Abstract
Federated Learning (FL), as a promising distributed learning paradigm, is proposed to resolve the contradiction between the data hunger of modern machine learning and the increasingly stringent need for data privacy. However, clients naturally hold local data with different distributions and inconsistent local optima, which degrades the model performance of FL. Many previous methods focus on mitigating this objective inconsistency. Although local objective consistency can be guaranteed as the number of communication rounds tends to infinity, the accumulation of global drift and the limitation on the potential of local updates remain non-negligible in those methods. In this article, we study a new framework for data-heterogeneous FL, in which local updates on clients are steered toward the global optimum to accelerate FL. We propose a new approach called Slingshot. Slingshot's design goals are twofold: i) to retain the potential of local updates, and ii) to combine local and global trends. Experimental results show that Slingshot makes local updates more globally favorable and outperforms other popular methods under various FL settings. For example, on CIFAR-10, Slingshot achieves a 46.52% improvement in test accuracy and a 48.21× speedup for a lightweight neural network named SqueezeNet.
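To make the abstract's idea of "combining local and global trends" concrete, below is a minimal, hypothetical sketch of one plausible client-side update rule: the client's own gradient is blended with a server-broadcast global trend direction so that local progress stays globally favorable. This is not the authors' released code or Slingshot's actual algorithm; the names local_update, lambda_blend, and global_trend, and the specific blending formula, are illustrative assumptions only.

    # Hypothetical sketch, not Slingshot's actual method: blend the client's
    # local gradient with a server-provided global trend direction.
    import numpy as np

    def local_update(weights, local_grad, global_trend, lr=0.01, lambda_blend=0.5):
        """One client step: follow the local gradient, biased toward the
        global trend so the local update remains globally favorable."""
        direction = (1 - lambda_blend) * local_grad + lambda_blend * global_trend
        return weights - lr * direction

    # Example: the server might estimate the global trend as the previous
    # round's aggregate model movement (an assumption for illustration).
    w = np.zeros(4)
    g_local = np.array([0.8, -0.2, 0.1, 0.0])   # client's own gradient
    g_trend = np.array([0.5, 0.5, 0.0, -0.1])   # server's global movement estimate
    w_next = local_update(w, g_local, g_trend)

Under this sketch, lambda_blend = 0 recovers plain local SGD, while larger values pull each client step toward the shared global direction, which is one way to limit the client drift the abstract describes.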
Keywords
Data models, Servers, Federated learning, Computational modeling, Measurement, Training, Task analysis, data heterogeneity, catastrophic forgetting, model performance