Poster

FedBoost: A Communication-Efficient Algorithm for Federated Learning

Jenny Hamer · Mehryar Mohri · Ananda Theertha Suresh

Keywords: [ General Machine Learning Techniques ] [ Other ] [ Boosting / Ensemble Methods ]


Abstract:

Communication cost is often a bottleneck in federated learning and other client-based distributed learning scenarios. To overcome this, several gradient compression and model compression algorithms have been proposed. In this work, we propose an alternative approach in which an ensemble of pre-trained base predictors is trained via federated learning. This method enables a model that might otherwise exceed the communication bandwidth and storage capacity of the clients to be learned from on-device data through federated learning. Motivated by language modeling, we prove the optimality of ensemble methods for density estimation under both standard empirical risk minimization and agnostic risk minimization. We provide communication-efficient ensemble algorithms for federated learning in which the per-round communication cost is independent of the size of the ensemble. Furthermore, unlike work on gradient compression, our approach reduces the communication cost of both server-to-client and client-to-server communication.
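To make the communication pattern concrete, below is a minimal NumPy sketch of the core idea as described in the abstract: the base predictors are fixed and pre-trained, only the ensemble weights are learned federatedly, and each round touches just a small sampled subset of the predictors, so per-round traffic in both directions is independent of the ensemble size. The squared-distance objective, the sampling scheme, and all names are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: q pre-trained base density estimators over a finite
# domain, represented as fixed probability vectors h_1, ..., h_q.
q, domain = 8, 32
base = rng.dirichlet(np.ones(domain), size=q)          # shape (q, domain)

# Each client holds an empirical distribution over the same domain.
clients = [rng.dirichlet(np.ones(domain)) for _ in range(20)]

alpha = np.ones(q) / q          # ensemble mixture weights (the learned model)
C, eta, rounds = 3, 0.5, 200    # C predictors communicated per round

def client_grad(p, sub, alpha_sub):
    """Gradient of ||sum_k alpha_k h_k - p||^2 restricted to the sampled
    predictors; the client only ever receives those C predictors."""
    mix = alpha_sub @ base[sub]                        # current mixture on sub
    return 2.0 * base[sub] @ (mix - p)                 # shape (C,)

for _ in range(rounds):
    sub = rng.choice(q, size=C, replace=False)         # server samples C of q
    chosen = rng.choice(len(clients), size=5, replace=False)
    # Clients return gradients only for the C sampled coordinates, so
    # client-to-server traffic is also independent of q.
    g = np.mean([client_grad(clients[i], sub, alpha[sub]) for i in chosen],
                axis=0)
    alpha[sub] -= eta * g
    alpha = np.clip(alpha, 0.0, None)
    alpha /= alpha.sum()                               # project onto simplex

print("learned ensemble weights:", np.round(alpha, 3))
```

The design point the sketch illustrates is that the heavy objects (the base predictors) never move in full: each round ships only C of them server-to-client and C scalar gradients client-to-server, regardless of how large the ensemble grows.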