FedVeer: Self-Adaptive Skew Estimation for Robust Federated Learning
Yun Xin ⋅ Bangqi Pan ⋅ Jianfeng Lu ⋅ Shuqin Cao ⋅ Gang Li ⋅ Guanghui Wen
Abstract
Federated Learning (FL) enables collaborative model training across decentralized clients, but its performance often degrades under non-IID data distributions, particularly in the presence of data skew. Existing approaches mitigate this issue by estimating client skew via kernel density estimation over neighboring model updates, which preserves privacy and reduces communication costs. However, such approaches suffer from two fundamental limitations: bias toward skewed majority clients caused by fixed neighborhood structures, and vulnerability to noise-induced perturbations in kernel space. To address these challenges, we propose FedVeer, a skew-aware FL framework based on self-adaptive kernel density estimation with $k$-free neighborhoods. FedVeer dynamically determines the neighborhood size via max-margin learning to mitigate majority-client bias, and further incorporates Kalman filtering to stabilize margin estimation under noisy updates, with a high-probability theoretical guarantee on margin deviation. Extensive experiments on real-world datasets demonstrate that FedVeer consistently outperforms four baselines, improving accuracy by up to 6.36\% and reducing noise-induced degradation by up to 6.01\%.
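To give intuition for the margin-stabilization idea mentioned above, the sketch below applies a scalar Kalman filter to a noisy sequence of per-round margin estimates. This is a minimal illustration only: the function name `kalman_smooth`, the random-walk state model, and the variance parameters are assumptions for exposition, not FedVeer's actual update equations.

```python
import numpy as np

def kalman_smooth(measurements, process_var=1e-4, measurement_var=1e-1):
    """Smooth a slowly drifting scalar (e.g., a margin estimate) observed
    under noise, using a scalar Kalman filter with a random-walk model.
    All parameter choices here are illustrative assumptions."""
    x = measurements[0]   # initial state estimate
    p = 1.0               # initial estimate variance
    filtered = []
    for z in measurements:
        # Predict: the margin is assumed to follow a random walk.
        p += process_var
        # Update: blend the prediction with the noisy measurement.
        k = p / (p + measurement_var)   # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        filtered.append(x)
    return filtered

# Toy usage: a fixed true margin of 0.5 observed with Gaussian noise.
rng = np.random.default_rng(0)
noisy = 0.5 + rng.normal(0.0, 0.3, size=200)
smoothed = kalman_smooth(noisy)
```

With a small process variance relative to the measurement variance, the filter averages over many rounds, so the smoothed sequence fluctuates far less than the raw measurements while tracking the underlying margin.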