Article overview
DisPFL: Towards Communication-Efficient Personalized Federated Learning via Decentralized Sparse Training

Authors: Rong Dai; Li Shen; Fengxiang He; Xinmei Tian; Dacheng Tao
Date: 1 Jun 2022

Abstract: Personalized federated learning handles data heterogeneity across clients by learning a dedicated, tailored local model for each user. However, existing works are often built on a centralized protocol, which puts high communication pressure on the central server and makes the system vulnerable when that server fails or is attacked. In this work, we propose a novel personalized federated learning framework with a decentralized (peer-to-peer) communication protocol, named Dis-PFL, which employs personalized sparse masks to customize sparse local models on the edge. To further reduce communication and computation costs, we propose a decentralized sparse training technique in which each local model in Dis-PFL maintains a fixed number of active parameters throughout local training and peer-to-peer communication. Comprehensive experiments demonstrate that Dis-PFL significantly reduces the communication load of the busiest node among all clients while achieving higher model accuracy at lower computation cost and with fewer communication rounds. Furthermore, we demonstrate that our method adapts easily to heterogeneous local clients with varying computational capacities and achieves better personalized performance.

Source: arXiv, 2206.00187
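The decentralized sparse training described in the abstract lends itself to a compact illustration: each client keeps a personalized binary mask with a fixed number of active parameters, trains only those parameters locally, and averages each active coordinate with the peers that also keep it active, so only sparse models ever travel over the network. The sketch below is a minimal toy version of that idea, assuming a ring topology, a masked coordinate-wise averaging rule, and a quadratic local objective per client; the Client class and all parameter choices here are illustrative assumptions, not the authors' reference implementation.

import numpy as np

class Client:
    """Toy client holding a dense weight vector and a fixed sparse mask."""

    def __init__(self, dim, sparsity, rng):
        self.weights = rng.standard_normal(dim) * 0.01
        # Personalized binary mask: a fixed number of active parameters
        # that stays constant through training and communication.
        k = int(dim * (1.0 - sparsity))
        self.mask = np.zeros(dim, dtype=bool)
        self.mask[rng.choice(dim, size=k, replace=False)] = True

    def local_step(self, grad_fn, lr=0.1):
        # Update only the active parameters; pruned ones stay exactly zero.
        g = grad_fn(self.weights * self.mask)
        self.weights[self.mask] -= lr * g[self.mask]
        self.weights[~self.mask] = 0.0

    def aggregate(self, neighbor_states):
        # neighbor_states: (weights, mask) snapshots taken before the
        # exchange, so all clients update from the same round. Each active
        # coordinate is averaged over the peers that also keep it active.
        ws = np.stack([w for w, _ in neighbor_states] + [self.weights])
        ms = np.stack([m for _, m in neighbor_states] + [self.mask])
        counts = np.maximum(ms.sum(axis=0), 1)        # avoid divide-by-zero
        avg = (ws * ms).sum(axis=0) / counts          # masked average
        self.weights = np.where(self.mask, avg, 0.0)  # keep own sparsity

rng = np.random.default_rng(0)
n, dim = 4, 100
clients = [Client(dim, sparsity=0.8, rng=rng) for _ in range(n)]
targets = rng.standard_normal((n, dim))  # one local objective per client

for _ in range(50):
    for c, t in zip(clients, targets):
        c.local_step(lambda w, t=t: w - t)  # grad of 0.5 * ||w - t||^2
    states = [(c.weights.copy(), c.mask) for c in clients]
    for i, c in enumerate(clients):  # ring: each client hears two peers
        c.aggregate([states[(i - 1) % n], states[(i + 1) % n]])

Because each mask never changes, a peer-to-peer message need only carry the active coordinates, which is where the communication savings claimed in the abstract come from.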
No review found.