Poster
Personalized Federated Learning using Hypernetworks
Aviv Shamsian · Aviv Navon · Ethan Fetaya · Gal Chechik

Thu Jul 22 09:00 AM -- 11:00 AM (PDT)

Personalized federated learning is tasked with training machine learning models for multiple clients, each with its own data distribution. The goal is to train personalized models collaboratively while accounting for data disparities across clients and reducing communication costs.
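As a point of reference, this goal is commonly formalized as follows (standard notation, assumed here rather than taken from this listing): with $n$ clients, where client $i$ holds data distribution $\mathcal{D}_i$ and its own parameters $\theta_i$, one minimizes

$$\min_{\theta_1, \ldots, \theta_n} \; \sum_{i=1}^{n} \mathbb{E}_{(x, y) \sim \mathcal{D}_i} \big[ \ell\big( f(x; \theta_i), y \big) \big],$$

so that every client ends up with a personal model while training is still coordinated through a central server.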

We propose a novel approach to this problem using hypernetworks, termed pFedHN for personalized Federated HyperNetworks. In this approach, a central hypernetwork model is trained to generate a set of models, one model for each client. This architecture provides effective parameter sharing across clients while maintaining the capacity to generate unique and diverse personal models. Furthermore, since hypernetwork parameters are never transmitted, this approach decouples the communication cost from the trainable model size. We test pFedHN empirically in several personalized federated learning challenges and find that it outperforms previous methods. Finally, since hypernetworks share information across clients, we show that pFedHN can generalize better to new clients whose distributions differ from any client observed during training.
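The training loop this describes can be sketched roughly as below. This is a minimal PyTorch sketch, not the authors' released implementation: the names (HyperNet, TargetNet, server_round) and all layer sizes are illustrative assumptions. The server keeps the hypernetwork parameters and one learnable embedding per client, generates client i's weights from its embedding, transmits only those weights, and maps the client's local update back onto the hypernetwork via the chain rule, so the hypernetwork itself never leaves the server.

# Minimal sketch (assumed names and sizes, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetNet(nn.Module):
    """Small personal model run on the client; its weights come from the hypernetwork."""
    shapes = {
        "fc1.weight": (100, 784), "fc1.bias": (100,),
        "fc2.weight": (10, 100),  "fc2.bias": (10,),
    }

    def forward(self, x, params):
        h = F.relu(F.linear(x, params["fc1.weight"], params["fc1.bias"]))
        return F.linear(h, params["fc2.weight"], params["fc2.bias"])

class HyperNet(nn.Module):
    """Central hypernetwork: learnable client embedding -> all weights of the personal model."""
    def __init__(self, n_clients, shapes, embed_dim=32, hidden=128):
        super().__init__()
        self.shapes = shapes
        self.embeddings = nn.Embedding(n_clients, embed_dim)
        total = sum(torch.Size(s).numel() for s in shapes.values())
        self.mlp = nn.Sequential(nn.Linear(embed_dim, hidden), nn.ReLU(), nn.Linear(hidden, total))

    def forward(self, client_id):
        flat = self.mlp(self.embeddings(torch.tensor([client_id]))).squeeze(0)
        params, offset = {}, 0
        for name, shape in self.shapes.items():
            n = torch.Size(shape).numel()
            params[name] = flat[offset:offset + n].view(shape)
            offset += n
        return params

def server_round(hnet, hnet_opt, client_id, local_train_fn):
    """One round for one client: only the generated weights and their update cross the network."""
    params = hnet(client_id)                                   # theta_i, differentiable w.r.t. hypernetwork
    sent = {k: v.detach().clone() for k, v in params.items()}  # what is actually transmitted
    updated = local_train_fn(sent)                             # client runs local SGD on its own data
    delta = {k: updated[k] - sent[k] for k in sent}            # roughly -lr * gradient of the client loss
    hnet_opt.zero_grad()
    # Chain rule: accumulate (d theta_i / d phi)^T (-delta) into the hypernetwork gradients, then descend.
    torch.autograd.backward([params[k] for k in params],
                            grad_tensors=[-delta[k] for k in params])
    hnet_opt.step()

In use, one would create hnet = HyperNet(n_clients, TargetNet.shapes), put an optimizer over hnet.parameters() (which includes the client embeddings), and call server_round once for each client sampled in a communication round; only tensors the size of the small personal model are ever exchanged, regardless of how large the hypernetwork is.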

Author Information

Aviv Shamsian (Bar-Ilan University)
Aviv Navon (Bar-Ilan University)
Ethan Fetaya (Bar-Ilan University)
Gal Chechik (NVIDIA / Bar-Ilan University)
