

Poster

FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler

Hongyi Peng · Han Yu · Xiaoli Tang · Xiaoxiao Li


Abstract:

Federated learning (FL) has enabled collaborative machine learning across distributed data owners (a.k.a. FL clients). Data heterogeneity is an important challenge facing FL in practice. While prior research has generally focused on improving the accuracy and convergence of FL models in the face of non-iid data through a diverse range of techniques, model calibration remains under-explored. Calibration ensures that a model's confidence in its predictions aligns with the actual likelihood of being correct. Yet, the inherent data heterogeneity and pressing privacy concerns associated with FL make existing centralized model calibration methods inapplicable. Our study uncovers that the existing FL model aggregation approach might lead to sub-optimal model calibration. To address this issue, we propose a novel Federated Calibration (FedCal) approach, emphasizing both local and global calibration. It leverages client-specific scalers for local calibration to effectively correct output misalignment without sacrificing prediction accuracy. These scalers are then aggregated via weight averaging to generate a global scaler, minimizing the global calibration error. Theoretical analysis shows that, despite constraints on the variance of clients' label distributions, the global model calibration error still decreases asymptotically. Extensive experiments on four benchmark datasets demonstrate significant advantages of FedCal over five state-of-the-art methods, reducing the global model calibration error by 47.66% on average compared to the best-performing baseline.
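The abstract's core mechanism is: (1) each client fits a parameterized scaler on its own held-out data to calibrate its outputs locally, and (2) the scaler parameters are weight-averaged into a global scaler. The sketch below illustrates that two-step idea only. The scaler form (a single learned temperature), the function names, and the training loop are illustrative assumptions for the sketch, not FedCal's actual scaler architecture or objective, which are specified in the paper.

```python
# Minimal sketch of local calibration + scaler aggregation, assuming a
# toy temperature-style scaler. Not the paper's exact design.
import torch
import torch.nn.functional as F


class TemperatureScaler(torch.nn.Module):
    """Toy parameterized scaler: rescales logits by a learned temperature."""

    def __init__(self):
        super().__init__()
        self.log_t = torch.nn.Parameter(torch.zeros(1))  # T = exp(log_t) > 0

    def forward(self, logits):
        return logits / self.log_t.exp()


def fit_local_scaler(logits, labels, steps=200, lr=0.05):
    """Local calibration: minimize NLL of scaled logits on held-out data.

    The base model is untouched, and rescaling preserves the argmax,
    so prediction accuracy is unchanged.
    """
    scaler = TemperatureScaler()
    opt = torch.optim.Adam(scaler.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(scaler(logits), labels)
        loss.backward()
        opt.step()
    return scaler


def aggregate_scalers(scalers, weights):
    """Global scaler via weight averaging of client scaler parameters."""
    global_scaler = TemperatureScaler()
    with torch.no_grad():
        for name, param in global_scaler.named_parameters():
            param.copy_(sum(w * dict(s.named_parameters())[name].data
                            for s, w in zip(scalers, weights)))
    return global_scaler
```

Weighting each client's scaler by its dataset size would mirror FedAvg-style aggregation; that choice, like the rest of this sketch, is an assumption rather than a detail stated in the abstract.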
