FEDEMOE: IMPROVING PERSONALIZATION ON HETEROGENEOUS FEDERATED LEARNING VIA ELASTIC MIXTURE OF EXPERTS ARCHITECTURE
Abstract
Heterogeneous federated learning (HtFL) has emerged as a promising approach to addressing heterogeneity in local computational resources and data distributions. However, existing methods degrade model personalization because personalized and generalized knowledge are either entangled or one dominates the other. To address this issue, we propose FedEMoE, a novel Elastic Mixture of Experts (EMoE) architecture for HtFL that decouples personalization from generalization. Specifically, FedEMoE employs a multi-scale feature extraction mechanism via personalized experts to enrich personalized knowledge. Furthermore, we design an elastic shared expert to break the knowledge-transfer bottleneck across heterogeneous client models. The elastic shared expert adaptively expands or shrinks according to the status of each expert, as determined by weight-spectrum analysis. Extensive experiments under both statistical and model heterogeneity demonstrate that FedEMoE significantly outperforms state-of-the-art methods in the accuracy of each heterogeneous model across diverse datasets.
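To make the described architecture concrete, below is a minimal PyTorch sketch of one EMoE block under stated assumptions: the multi-scale personalized experts are rendered as parallel convolutions with different kernel sizes, the elastic shared expert as a bottleneck whose hidden width is the dimension that grows or shrinks, and the weight-spectrum analysis as an effective-rank check on singular values. All names (ElasticMoE, effective_rank, adjust_width) and design choices here are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class ElasticMoE(nn.Module):
    """Hypothetical EMoE block: multi-scale personalized experts plus an
    elastic shared expert whose hidden width can grow or shrink."""

    def __init__(self, dim, kernel_sizes=(3, 5, 7), shared_width=64):
        super().__init__()
        # Personalized experts: one conv branch per receptive-field scale.
        self.personal_experts = nn.ModuleList(
            nn.Conv2d(dim, dim, k, padding=k // 2) for k in kernel_sizes
        )
        # Elastic shared expert: `shared_width` is the quantity the
        # weight-spectrum analysis would expand or shrink between rounds.
        self.shared_expert = nn.Sequential(
            nn.Conv2d(dim, shared_width, 1), nn.ReLU(),
            nn.Conv2d(shared_width, dim, 1),
        )
        self.gate = nn.Linear(dim, len(kernel_sizes) + 1)

    def forward(self, x):
        # Gate on globally pooled features: (B, num_experts + 1) weights.
        gates = torch.softmax(self.gate(x.mean(dim=(2, 3))), dim=-1)
        outs = [e(x) for e in self.personal_experts] + [self.shared_expert(x)]
        # Gate-weighted sum of all expert outputs.
        return sum(g.view(-1, 1, 1, 1) * o
                   for g, o in zip(gates.unbind(dim=1), outs))


def effective_rank(weight, tol=0.01):
    """Count singular values above `tol` times the largest one -- a simple
    stand-in for the abstract's weight-spectrum analysis."""
    s = torch.linalg.svdvals(weight.flatten(1))
    return int((s > tol * s[0]).sum())


def adjust_width(weight, width, grow=1.25, shrink=0.8):
    """Expand the shared expert when its spectrum is nearly full rank
    (a capacity bottleneck) and shrink it when many directions are unused."""
    r = effective_rank(weight)
    if r >= 0.9 * width:
        return int(width * grow)
    if r <= 0.5 * width:
        return int(width * shrink)
    return width


# Example: one forward pass, then a width decision for the shared expert.
block = ElasticMoE(dim=32)
y = block(torch.randn(2, 32, 16, 16))  # -> (2, 32, 16, 16)
new_width = adjust_width(block.shared_expert[0].weight, width=64)
```

In this sketch the expand/shrink decision is made per round from the shared expert's singular-value spectrum alone; how FedEMoE actually aggregates expert status across heterogeneous clients is detailed in the paper body, not the abstract.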