
FedBCD: A Communication-Efficient Collaborative Learning Framework for Distributed Features

IEEE Transactions on Signal Processing (2022)

Citations: 27 | Views: 51
Abstract
We introduce a novel federated learning framework that allows multiple parties holding different sets of attributes about the same users to jointly build models without exposing their raw data or model parameters. Conventional federated learning approaches are inefficient for such cross-silo problems because they require exchanging messages for gradient updates at every iteration, and sharing such messages during learning raises security concerns. We propose a Federated Stochastic Block Coordinate Descent (FedBCD) algorithm that allows each party to conduct multiple local updates before each communication, effectively reducing communication overhead. Under a practical security model, we show that parties cannot infer others' exact raw data ("deep leakage") from the collections of messages exchanged in our framework, regardless of the number of communication rounds performed. Further, we provide convergence guarantees and empirical evaluations on a variety of tasks and datasets, demonstrating significant improvements in efficiency.
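The core mechanism is easy to sketch. In this vertically partitioned setting, each party holds a distinct block of features for the same samples and owns only the corresponding block of model weights; per communication round, the parties exchange their intermediate outputs once, then each runs Q local gradient steps on its own block against the other party's stale message. The toy Python sketch below illustrates this for a logistic model split across two parties. It is a minimal illustration, not the paper's implementation: the names (fedbcd_logreg, Q, lr) are invented for the example, and, purely for compactness, the labels y appear inside both parties' update loops, whereas the actual protocol exchanges only derived messages and couples the exchange with the paper's privacy analysis.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fedbcd_logreg(X_A, X_B, y, Q=5, rounds=50, lr=0.1):
        # Toy parallel FedBCD for a logistic model whose score is
        # X_A @ w_A + X_B @ w_B, with features split between two parties.
        n = len(y)
        w_A = np.zeros(X_A.shape[1])
        w_B = np.zeros(X_B.shape[1])
        for _ in range(rounds):
            # Communication: each party sends its intermediate output once.
            H_A = X_A @ w_A
            H_B = X_B @ w_B
            # Party A: Q local steps on w_A against B's stale message H_B.
            wa = w_A.copy()
            for _ in range(Q):
                resid = sigmoid(X_A @ wa + H_B) - y
                wa -= lr * (X_A.T @ resid) / n
            # Party B: Q local steps on w_B against A's stale message H_A.
            # (y appears here only to keep the sketch short; the real
            # protocol exchanges derived messages instead.)
            wb = w_B.copy()
            for _ in range(Q):
                resid = sigmoid(H_A + X_B @ wb) - y
                wb -= lr * (X_B.T @ resid) / n
            w_A, w_B = wa, wb
        return w_A, w_B

Setting Q = 1 recovers the conventional one-update-per-exchange scheme; a larger Q cuts communication rounds at the cost of staler messages, which is precisely the trade-off the paper's convergence analysis addresses.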
Keywords
Stochastic processes, Data models, Collaborative work, Signal processing algorithms, Distributed databases, Security, Data privacy, Federated learning, federated stochastic block coordinate descent, cross-silo federated learning, distributed features