
Dynamic Regularized Sharpness Aware Minimization in Federated Learning: Approaching Global Consistency and Smooth Landscape
Yan Sun · Li Shen · Shixiang Chen · Liang Ding · Dacheng Tao

Thu Jul 27 06:40 PM -- 06:48 PM (PDT) @ Meeting Room 313
In federated learning (FL), a cluster of local clients is coordinated by a global server to cooperatively train a single model with privacy protection. Due to multiple local updates on isolated non-iid datasets, clients are prone to overfit to their own local optima, which deviate severely from the global objective and significantly undermine performance. Most previous works focus only on enhancing the consistency between the local and global objectives to alleviate these prejudicial client drifts from the optimization perspective, and their performance deteriorates markedly under high heterogeneity. In this work, we propose a novel and general algorithm, FedSMOO, which jointly considers the optimization and generalization targets to efficiently improve performance in FL. Concretely, FedSMOO adopts a dynamic regularizer to guide the local optima toward the global objective, and this regularizer is in turn revised by a global Sharpness Aware Minimization (SAM) optimizer that searches for consistent flat minima. Our theoretical analysis indicates that FedSMOO achieves a fast $\mathcal{O}(1/T)$ convergence rate with a low generalization bound. Extensive numerical studies on real-world datasets verify its efficiency and strong generality.
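The abstract builds on the standard Sharpness Aware Minimization (SAM) update, which first perturbs the parameters toward the locally sharpest direction within a radius $\rho$ and then descends using the gradient at that perturbed point. A minimal sketch of one such step on a toy quadratic loss is below; the loss, learning rate, and `rho` are illustrative assumptions, not FedSMOO's actual federated procedure or hyperparameters.

```python
import numpy as np

def loss_grad(w):
    # Toy loss L(w) = 0.5 * ||w||^2, whose gradient is simply w.
    return w

def sam_step(w, lr=0.1, rho=0.05):
    """One SAM step: ascend to the perturbed point, descend with its gradient."""
    g = loss_grad(w)
    # Ascent direction: normalized gradient scaled to radius rho.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Gradient evaluated at the adversarially perturbed parameters.
    g_sharp = loss_grad(w + eps)
    # Descend from the ORIGINAL point using the sharpness-aware gradient.
    return w - lr * g_sharp

w = np.array([1.0, -2.0])
for _ in range(100):
    w = sam_step(w)
```

In FedSMOO, per the abstract, this perturbation is coordinated globally so that clients agree on a consistent flat minimum rather than each flattening its own local landscape; the sketch above shows only the single-node SAM mechanics.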

Author Information

Yan Sun (PhD student, The University of Sydney)

Li Shen (JD Explore Academy)
Shixiang Chen (University of Science and Technology of China)
Liang Ding (JD Explore Academy, JD.com Inc.)
Dacheng Tao
