Spotlight
DisPFL: Towards Communication-Efficient Personalized Federated Learning via Decentralized Sparse Training
Rong Dai · Li Shen · Fengxiang He · Xinmei Tian · Dacheng Tao

Wed Jul 20 10:20 AM -- 10:25 AM (PDT) @ Ballroom 1 & 2

Personalized federated learning is proposed to handle the data heterogeneity problem amongst clients by learning a dedicated local model for each user. However, existing works are often built in a centralized way, leading to high communication pressure and high vulnerability when a failure or an attack on the central server occurs. In this work, we propose DisPFL, a novel personalized federated learning framework based on a decentralized (peer-to-peer) communication protocol, which employs personalized sparse masks to customize sparse local models on the edge. To further reduce communication and computation cost, we propose a decentralized sparse training technique, in which each local model in DisPFL maintains a fixed number of active parameters throughout local training and peer-to-peer communication. Comprehensive experiments demonstrate that DisPFL significantly reduces the communication load of the busiest node among all clients and, at the same time, achieves higher model accuracy with lower computation cost and fewer communication rounds. Furthermore, we demonstrate that our method easily adapts to heterogeneous local clients with varying computation capabilities and achieves better personalized performance.
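
For intuition, the following Python sketch (using NumPy) illustrates one plausible realization of the ingredients described in the abstract: a fixed-sparsity mask, a prune-and-regrow update that keeps the number of active parameters constant, and a neighbor aggregation that averages only where sparse models overlap. The function names, the magnitude/gradient criteria, and the aggregation rule are illustrative assumptions rather than the paper's exact algorithm.

import numpy as np

# Illustrative sketch, not the authors' exact method: each client keeps a
# binary mask with a fixed number of active weights, periodically swaps the
# smallest-magnitude active weights for the largest-gradient inactive ones,
# and averages with neighbors only where their sparse models overlap.

def fixed_sparsity_mask(shape, density, rng):
    """Random binary mask with a fixed count of active entries."""
    flat = np.zeros(int(np.prod(shape)), dtype=bool)
    k = int(density * flat.size)
    flat[rng.choice(flat.size, size=k, replace=False)] = True
    return flat.reshape(shape)

def prune_and_regrow(weights, grads, mask, swap_fraction):
    """Keep the number of active parameters constant: drop the
    smallest-magnitude active weights and regrow the same number of
    inactive positions with the largest gradient magnitudes."""
    n_swap = int(swap_fraction * mask.sum())
    active, inactive = np.flatnonzero(mask), np.flatnonzero(~mask)
    drop = active[np.argsort(np.abs(weights.flat[active]))[:n_swap]]
    grow = inactive[np.argsort(-np.abs(grads.flat[inactive]))[:n_swap]]
    new_mask = mask.copy()
    new_mask.flat[drop] = False
    new_mask.flat[grow] = True
    return new_mask

def aggregate_with_neighbors(local_w, local_mask, neighbor_ws, neighbor_masks):
    """Average parameters where the sparse models overlap, then project the
    result back onto the client's own personalized mask."""
    ws = np.stack([local_w] + list(neighbor_ws))
    ms = np.stack([local_mask] + list(neighbor_masks)).astype(float)
    counts = ms.sum(axis=0)
    avg = (ws * ms).sum(axis=0) / np.maximum(counts, 1.0)
    merged = np.where(counts > 0, avg, local_w)
    return merged * local_mask

# Toy usage: one local round with a 50%-dense mask and a 25% mask swap.
rng = np.random.default_rng(0)
w, g = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
m = fixed_sparsity_mask(w.shape, density=0.5, rng=rng)
m = prune_and_regrow(w * m, g, m, swap_fraction=0.25)
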

Author Information

Rong Dai (University of Science and Technology of China)
Li Shen (JD Explore Academy)
Fengxiang He (JD.com Inc)

Fengxiang He received his BSc in statistics from the University of Science and Technology of China and his MPhil and PhD in computer science from the University of Sydney. He is currently an algorithm scientist at JD Explore Academy, JD.com Inc, where he leads its trustworthy AI team. His research interests cover the theory and practice of trustworthy AI, including deep learning theory, privacy preservation, and fairness. He has published in top conferences and journals, including ICML, NeurIPS, ICLR, CVPR, ICCV, UAI, AAAI, IJCAI, TNNLS, TCSVT, TMM, and Neural Computation.

Xinmei Tian (University of Science and Technology of China)
Dacheng Tao
