Poster
Federated Learning with Label Distribution Skew via Logits Calibration
Jie Zhang · Zhiqi Li · Bo Li · Jianghe Xu · Shuang Wu · Shouhong Ding · Chao Wu
Hall E #721
Keywords: [ DL: Algorithms ] [ APP: Computer Vision ] [ OPT: Optimization and Learning under Uncertainty ] [ OPT: Large Scale, Parallel and Distributed ]
Traditional federated optimization methods perform poorly on heterogeneous data (i.e., they suffer accuracy degradation), especially when the data are highly skewed. In this paper, we study label distribution skew in FL, where the distribution of labels varies across clients. First, we analyze label distribution skew from a statistical perspective. We demonstrate both theoretically and empirically that previous methods based on softmax cross-entropy are not suitable, as they can cause local models to heavily overfit to minority classes and missing classes. In addition, we introduce a theoretical deviation bound that measures the deviation of the gradient after the local update. Finally, we propose FedLC (Federated learning via Logits Calibration), which calibrates the logits before the softmax cross-entropy according to the probability of occurrence of each class. FedLC applies a fine-grained calibrated cross-entropy loss to the local update by adding a pairwise label margin. Extensive experiments on federated datasets and real-world datasets demonstrate that FedLC leads to a more accurate global model and much-improved performance. Moreover, integrating other FL methods into our approach further enhances the performance of the global model.
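To make the idea of calibrating logits by class-occurrence probability concrete, below is a minimal, illustrative PyTorch sketch of a count-aware calibrated cross-entropy for a client's local update. It is not the authors' exact implementation: the function name `calibrated_cross_entropy`, the hyperparameter `tau`, and the specific margin form `tau * n_c ** (-1/4)` (in the spirit of logit-adjustment/margin losses) are assumptions for illustration; the paper's pairwise label margin may differ in detail.

```python
import torch
import torch.nn.functional as F


def calibrated_cross_entropy(logits, targets, class_counts, tau=1.0):
    """Cross-entropy with logits calibrated by per-class sample counts.

    Illustrative sketch of logit calibration for label-skewed clients:
    rarer classes receive a larger margin so the local model is not
    dominated by the client's majority classes.  The margin form
    tau * n_c ** (-1/4) is an assumption, not the paper's exact formula.
    """
    # class_counts: shape (num_classes,), this client's samples per class.
    # Clamp to 1 so classes missing on the client do not divide by zero.
    counts = class_counts.clamp(min=1).float()
    margin = tau * counts.pow(-0.25)            # larger offset for rarer classes
    calibrated = logits - margin.unsqueeze(0)   # apply the same offset to every sample
    return F.cross_entropy(calibrated, targets)


# Toy usage on a skewed client: class 0 is common, class 2 is rare.
if __name__ == "__main__":
    torch.manual_seed(0)
    logits = torch.randn(8, 3)
    targets = torch.tensor([0, 0, 0, 0, 1, 1, 2, 0])
    class_counts = torch.tensor([100, 20, 2])
    loss = calibrated_cross_entropy(logits, targets, class_counts, tau=1.0)
    print(f"calibrated CE loss: {loss.item():.4f}")
```

Because rare classes get a larger subtraction from their logits during training, the model must produce correspondingly larger raw logits for those classes, which counteracts the tendency of locally trained models to under-predict minority and missing classes under label distribution skew.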