
Contractible Regularization for Federated Learning on Non-IID Data

IEEE International Conference on Data Mining (ICDM), 2022

Abstract
In the medical domain, gathering all data to train a global supervised model is very difficult because the data are scattered across different hospitals and subject to security and privacy concerns. In recent years, several federated learning models have been proposed for training over isolated data. These models usually employ a client-server framework: 1) train local models on clients in parallel; 2) aggregate local models on the server to produce a global one. By iterating these two steps, federated learning aims to approximate the performance of a model centrally trained on all the data. However, due to non-IID data distributions, local models can deviate from the optimal model, resulting in a biased aggregated global model. To address this problem, we propose a contractible regularization (ConTre) that acts on the local model's latent space. On each client, we first project the input data into a latent space and then impose regularization to avoid converging too fast to bad local optima. The proposed regularization can be easily integrated into existing federated learning frameworks without introducing additional parameters. Experimental results on multiple natural and medical image datasets show that ConTre significantly improves the performance of various federated learning frameworks. Our code is available at https://github.com/czifan/ConTre.pytorch.
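The client-server iteration described in the abstract can be sketched as follows. This is a minimal, dependency-free illustration of the FedAvg-style loop (parallel local updates, then server-side parameter averaging) on a toy non-IID setup; the `reg` penalty is a generic stand-in for a local regularizer and is NOT the paper's ConTre method — see the linked repository for the actual implementation.

```python
def aggregate(local_models):
    """Server step: average local model parameters (FedAvg-style)."""
    n = len(local_models)
    dim = len(local_models[0])
    return [sum(m[i] for m in local_models) / n for i in range(dim)]

def local_step(global_model, grad_fn, lr=0.1, reg=0.0):
    """Client step: one gradient update starting from the global model.
    `reg` adds an illustrative weight penalty (hypothetical placeholder,
    not ConTre) to temper how fast the local model drifts."""
    g = grad_fn(global_model)
    return [w - lr * (gi + reg * w) for w, gi in zip(global_model, g)]

# Two clients with non-IID quadratic objectives (optima at +1.0 and -1.0):
client_grads = [
    lambda w: [2.0 * (w[0] - 1.0)],   # client A pulls toward +1
    lambda w: [2.0 * (w[0] + 1.0)],   # client B pulls toward -1
]

model = [0.5]  # initial global model
for _ in range(50):
    local_models = [local_step(model, g) for g in client_grads]
    model = aggregate(local_models)
# The averaged model settles between the conflicting client optima.
```

With symmetric clients the averaged iterate contracts toward 0, the midpoint of the two local optima; with skewed or heterogeneous clients, plain averaging drifts toward a biased solution, which is the failure mode the paper's regularization targets.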
Key words
Federated learning, feature learning, image classification, computer vision