Clustered Federated Multitask Learning on Non-IID Data With Enhanced Privacy

IEEE Internet of Things Journal (2023)

Cited 9 | Views 32
Abstract
Federated learning is a machine learning paradigm that enables collaborative learning among clients while keeping clients’ data private. Federated multitask learning (FMTL) addresses the statistical challenge of non-independent and identically distributed (non-IID) data by training a personalized model for each client, yet it requires all clients to be online in every training round. To eliminate this full-participation limitation, we explore multitask learning combined with model clustering and first propose a clustered FMTL scheme to achieve multitask learning on non-IID data while simultaneously improving communication efficiency and model accuracy. To enhance privacy, we adopt a general dual-server architecture and further propose a secure clustered FMTL scheme by designing a series of secure two-party computation protocols. Convergence analysis and security analysis are conducted to prove the correctness and security of our methods. Numerical evaluation on public data sets validates that our methods outperform state-of-the-art methods in handling non-IID data while protecting privacy.
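
The sketch below illustrates, in broad strokes, the two ideas the abstract combines: grouping clients by the similarity of their model updates and aggregating a personalized model per cluster, plus additive secret sharing of updates across two servers. It is a minimal illustration under assumed simplifications, not the paper’s actual algorithm or protocols; the functions cluster_and_aggregate and additive_share, the use of k-means, and the parameter n_clusters are all hypothetical choices made for the example.

```python
# Minimal sketch (assumptions only): clients are grouped by the similarity of
# their flattened model updates, and each cluster averages its own model.
import numpy as np
from sklearn.cluster import KMeans


def cluster_and_aggregate(client_updates, n_clusters=3, seed=0):
    """Group client updates with k-means and average within each cluster.

    client_updates: list of 1-D np.ndarray, one flattened update per client.
    Returns (cluster_models, assignments).
    """
    updates = np.stack(client_updates)                        # (n_clients, dim)
    labels = KMeans(n_clusters=n_clusters, random_state=seed,
                    n_init=10).fit_predict(updates)
    cluster_models = {
        c: updates[labels == c].mean(axis=0)                  # per-cluster averaging
        for c in np.unique(labels)
    }
    return cluster_models, labels


def additive_share(update, rng):
    """Illustrative additive secret sharing for a dual-server setting:
    each server receives one share and learns nothing from it alone."""
    share_a = rng.standard_normal(update.shape)
    share_b = update - share_a
    return share_a, share_b


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Simulate 10 clients drawn from 3 underlying data distributions (non-IID).
    clients = [rng.standard_normal(16) + 5 * (i % 3) for i in range(10)]
    models, labels = cluster_and_aggregate(clients, n_clusters=3)
    print("cluster assignments:", labels)

    # A client would upload only shares; the servers reconstruct sums jointly.
    a, b = additive_share(clients[0], rng)
    assert np.allclose(a + b, clients[0])
```

In this toy setup the clustering is done on plaintext updates; the paper’s secure variant instead carries out such steps through secure two-party computation between the two servers.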
Keywords
Clustering,federated multitask learning,non-independent and identically distributed (IID) data,privacy,secure two-party computation