Poster
in
Workshop: Dynamic Neural Networks

FedHeN: Federated Learning in Heterogeneous Networks

Durmus Alp Emre Acar · Venkatesh Saligrama


Abstract:

We propose a novel training recipe for federated learning with heterogeneous networks, where each device may have a different architecture. We introduce a side training objective for the higher-complexity devices, which allows the different architectures to train jointly in a federated setting. We empirically show that our approach improves the training of the different architectures and yields substantial communication savings compared to state-of-the-art methods.
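To make the recipe concrete, below is a minimal, hypothetical sketch of joint federated training across two architectures. It assumes (since the abstract does not specify them) linear models where the small architecture is a nested sub-network of the large one, a regression task, and a squared-error side objective; the actual FedHeN architectures and objectives may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: low-complexity devices use the first d_small features,
# high-complexity devices use all d_large features of a linear model.
d_small, d_large = 4, 8
w_true = rng.normal(size=d_large)

def make_client(n=64):
    X = rng.normal(size=(n, d_large))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

clients_small = [make_client() for _ in range(2)]  # low-complexity devices
clients_large = [make_client() for _ in range(2)]  # high-complexity devices

w = np.zeros(d_large)          # global weights; first d_small are shared
lr, side_weight = 0.05, 0.5    # assumed hyperparameters

def mse_grad(w_local, X, y):
    err = X @ w_local - y
    return (X.T @ err) / len(y), float(np.mean(err ** 2))

round_losses = []
for _ in range(50):
    updates, losses = [], []
    # Low-complexity devices train only the shared sub-network.
    for X, y in clients_small:
        w_loc = w.copy()
        g, loss = mse_grad(w_loc[:d_small], X[:, :d_small], y)
        w_loc[:d_small] -= lr * g
        updates.append(w_loc)
        losses.append(loss)
    # High-complexity devices add a side objective: their nested
    # sub-network (first d_small weights) must also fit the task,
    # keeping the two architectures aligned during joint training.
    for X, y in clients_large:
        w_loc = w.copy()
        g_main, loss = mse_grad(w_loc, X, y)
        g_side, _ = mse_grad(w_loc[:d_small], X[:, :d_small], y)
        w_loc -= lr * g_main
        w_loc[:d_small] -= lr * side_weight * g_side
        updates.append(w_loc)
        losses.append(loss)
    w = np.mean(updates, axis=0)  # server-side federated averaging
    round_losses.append(float(np.mean(losses)))
```

Here the side objective only adds a local gradient term on each high-complexity device, so the communication cost per round stays that of plain federated averaging.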
