Federated Learning with Partial Gradients Over-the-Air

2023 20th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), 2023

Abstract
We develop a theoretical framework to study the training of federated learning models with partial gradients via over-the-air computing. The system consists of an edge server and multiple clients that aim to collaboratively minimize a global loss function. The clients conduct local training and upload the intermediate parameters (e.g., the gradients) via analog transmissions. Specifically, each client modulates the entries of its local gradient onto a set of common orthogonal waveforms, and all clients transmit their signals simultaneously to the edge server; owing to the limited number of orthogonal waveforms, only a subset of the parameters can be selected for uploading in each round of communication. The server passes the received analog signal through a bank of matched filters and obtains a noisy partial gradient vector. It then uses this partial gradient to update the global parameter and feeds the new model back to all the clients for another round of local training. We derive the convergence rate of this model training algorithm. We also conduct experiments to investigate the effects of different masking schemes on the convergence performance. The findings advance the understanding of over-the-air federated learning and provide useful insights for system design.
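The round structure described in the abstract can be sketched numerically: clients apply a common mask to their local gradients, the masked gradients superimpose in the air (plus channel noise), and the server takes an averaged, noisy partial gradient step. This is a minimal illustration, not the paper's implementation; the random masking scheme, function names, and all parameters are assumptions for the sketch.

```python
import numpy as np

def random_mask(d, k, rng):
    """One possible masking scheme: pick k of d coordinates uniformly at
    random each round (the paper compares several such schemes)."""
    m = np.zeros(d, dtype=bool)
    m[rng.choice(d, size=k, replace=False)] = True
    return m

def ota_round(theta, client_grads, mask, lr, noise_std, rng):
    """One round of over-the-air aggregation with a partial gradient.

    Each client modulates only the masked entries onto the shared
    orthogonal waveforms; simultaneous analog transmission sums the
    signals, and the matched-filter bank adds receiver noise on the
    k active coordinates.
    """
    # Superposition in the air: masked gradients add coordinate-wise.
    rx = np.sum([g * mask for g in client_grads], axis=0)
    # Channel/receiver noise appears only on the transmitted coordinates.
    rx = mask * (rx + noise_std * rng.normal(size=theta.size))
    # Server rescales by the number of clients to form a noisy partial
    # gradient, then takes a global descent step.
    partial = rx / len(client_grads)
    return theta - lr * partial
```

With a full mask and zero noise this reduces to ordinary gradient averaging, which is a useful sanity check when experimenting with masking schemes.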