Poster
in
Workshop: Federated Learning and Analytics in Practice: Algorithms, Systems, Applications, and Opportunities

SCAFF-PD: Communication Efficient Fair and Robust Federated Learning

Yaodong Yu · Sai Praneeth Karimireddy · Yi Ma · Michael Jordan


Abstract:

We present SCAFF-PD, a fast and communication-efficient algorithm for distributionally robust federated learning. Our approach improves fairness by optimizing a family of distributionally robust objectives tailored to heterogeneous clients. We leverage the special structure of these objectives and design an accelerated primal-dual (APD) algorithm that uses bias-corrected local steps (as in SCAFFOLD) to achieve significant gains in communication efficiency and convergence speed. We evaluate SCAFF-PD on several benchmark datasets and demonstrate its effectiveness in improving fairness and robustness while maintaining competitive accuracy. Our results suggest that SCAFF-PD is a promising approach for federated learning in resource-constrained and heterogeneous settings.
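To make the primal-dual structure concrete, below is a minimal NumPy sketch (not the authors' code) of a SCAFF-PD-style loop on a toy distributionally robust objective: the server minimizes over model parameters x while ascending on a simplex-constrained weight vector lam over clients, regularized toward uniform weights; clients run bias-corrected local steps in the style of SCAFFOLD. All hyperparameters (eta_x, eta_lam, mu, K) and the quadratic client losses are illustrative assumptions, and the control-variate rule here is plain SCAFFOLD rather than the paper's accelerated, weighted variant.

```python
import numpy as np

# Toy DRO objective:  min_x max_{lam in simplex}  sum_i lam_i f_i(x) - (mu/2)||lam - 1/n||^2
# with heterogeneous quadratic client losses f_i(x) = 0.5 x^T A_i x - b_i^T x.
rng = np.random.default_rng(0)
n, d, K, rounds = 4, 5, 5, 200
A = [np.diag(rng.uniform(0.5, 3.0, d)) for _ in range(n)]   # heterogeneous curvature
b = [np.ones(d) + 0.3 * rng.normal(size=d) for _ in range(n)]  # similar but distinct optima

def loss(i, x):
    return 0.5 * x @ A[i] @ x - b[i] @ x

def grad(i, x):
    return A[i] @ x - b[i]

def project_simplex(v):
    # Euclidean projection onto the probability simplex.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    return np.maximum(v + (1 - css[rho]) / (rho + 1), 0)

x = np.zeros(d)
lam = np.ones(n) / n
c = np.zeros(d)                        # server control variate
ci = [np.zeros(d) for _ in range(n)]   # client control variates
eta_x, eta_lam, mu = 0.05, 0.5, 0.1    # illustrative step sizes / regularization

init_obj = sum(lam[i] * loss(i, x) for i in range(n))
for _ in range(rounds):
    # Dual ascent: upweight high-loss clients, pulled toward uniform by mu.
    losses = np.array([loss(i, x) for i in range(n)])
    lam = project_simplex(lam + eta_lam * (losses - mu * (lam - 1.0 / n)))
    # Primal step: K bias-corrected local steps per client (SCAFFOLD-style).
    deltas, new_ci = [], []
    for i in range(n):
        y = x.copy()
        for _ in range(K):
            y -= eta_x * (grad(i, y) - ci[i] + c)
        new_ci.append(ci[i] - c + (x - y) / (K * eta_x))
        deltas.append(y - x)
    # Aggregate client updates with the current dual weights.
    x = x + sum(lam[i] * deltas[i] for i in range(n))
    c = c + sum(nc - co for nc, co in zip(new_ci, ci)) / n
    ci = new_ci

final_obj = sum(lam[i] * loss(i, x) for i in range(n))
```

The dual variable lam concentrates weight on the worst-off clients, which is what yields the fairness guarantee, while the control variates (ci, c) cancel the drift that heterogeneous local steps would otherwise introduce.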