

Poster in Workshop: Spurious correlations, Invariance, and Stability (SCIS)

Towards Multi-level Fairness and Robustness on Federated Learning

Fengda Zhang · Kun Kuang · Yuxuan Liu · Long Chen · Jiaxun Lu · Yunfeng Shao · Fei Wu · Chao Wu · Jun Xiao

Keywords: [ Robustness ] [ Fairness ] [ Federated Optimization ] [ Federated Learning ]


Abstract:

Federated learning (FL) has emerged as an important machine learning paradigm in which a global model is trained on the private data of distributed clients. However, the federated model can be biased due to spurious correlations or distribution shift over subpopulations, and it may disproportionately advantage or disadvantage some of those subpopulations, leading to problems of unfairness and non-robustness. In this paper, we formulate the problem of multi-level fairness and robustness on FL: training a global model that performs well on existing clients, on different subgroups formed by sensitive attribute(s), and on newly added clients at the same time. To solve this problem, we propose a unified optimization objective from the view of a federated uncertainty set, with theoretical analyses. We also develop an efficient federated optimization algorithm, named Federated Mirror Descent Ascent with Momentum Acceleration (FMDA-M), with a convergence guarantee. Extensive experimental results show that FMDA-M outperforms existing FL algorithms on multi-level fairness and robustness.
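The abstract does not state the objective explicitly, but a distributionally robust formulation over a federated uncertainty set typically takes a min-max form; the following is only a rough illustration of that general form, not the authors' exact objective:

\min_{\theta} \; \max_{\lambda \in \Lambda} \; \sum_{i=1}^{N} \lambda_i F_i(\theta)

Here F_i(\theta) denotes the local risk of client (or subgroup) i, and \Lambda is an uncertainty set of mixture weights over the N clients/subgroups contained in the probability simplex; the choice of \Lambda determines which distribution shifts (existing clients, sensitive-attribute subgroups, or new clients) the global model is required to be robust against.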

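FMDA-M is described as mirror descent ascent with momentum acceleration; the sketch below only illustrates that general descent-ascent structure. The update rules, step sizes, and the clients' loss_and_grad interface are assumptions for illustration, not the published algorithm.

import numpy as np

def fmda_m_sketch(clients, theta, rounds=100, lr_theta=0.1, lr_lam=0.5, beta=0.9):
    # Hypothetical federated mirror-descent-ascent loop with momentum.
    # `clients` is a list of objects exposing loss_and_grad(theta) -> (loss, grad),
    # where grad has the same shape as the 1-D parameter vector theta.
    n = len(clients)
    lam = np.full(n, 1.0 / n)          # mixture weights over clients, on the simplex
    momentum = np.zeros_like(theta)
    for _ in range(rounds):
        losses, grads = zip(*(c.loss_and_grad(theta) for c in clients))
        losses, grads = np.array(losses), np.stack(grads)
        # Mirror ascent on the simplex (exponentiated-gradient update on the weights).
        lam = lam * np.exp(lr_lam * losses)
        lam /= lam.sum()
        # Momentum-accelerated descent on the model under the current worst-case mixture.
        weighted_grad = (lam[:, None] * grads).sum(axis=0)
        momentum = beta * momentum + (1.0 - beta) * weighted_grad
        theta = theta - lr_theta * momentum
    return theta, lam

In this sketch the ascent step upweights clients or subgroups with high loss, so the descent step focuses on the worst-performing distributions; the momentum term only smooths the parameter updates and stands in for whatever acceleration scheme the paper actually uses.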