
FL-NTK: A Neural Tangent Kernel-based Framework for Federated Learning Analysis
Baihe Huang · Xiaoxiao Li · Zhao Song · Xin Yang

Wed Jul 21 09:00 PM -- 11:00 PM (PDT)

Federated Learning (FL) is an emerging learning scheme in which distributed clients jointly train deep neural networks without sharing data. Neural networks have become popular due to their unprecedented success. To the best of our knowledge, theoretical guarantees for FL with neural networks in explicit form and with multi-step updates remain unexplored. The training analysis of neural networks in FL is non-trivial for two reasons: first, the objective loss function being optimized is non-smooth and non-convex, and second, the model is not even updated in the gradient direction. Existing convergence results for gradient-descent-based methods rely heavily on the fact that the gradient direction is used for updating. This paper presents a new class of convergence analysis for FL, Federated Neural Tangent Kernel (FL-NTK), which corresponds to overparametrized ReLU neural networks trained by gradient descent in FL and is inspired by the Neural Tangent Kernel (NTK) analysis. Theoretically, FL-NTK converges to a globally optimal solution at a linear rate with properly tuned learning parameters. Furthermore, under proper distributional assumptions, FL-NTK also achieves good generalization. The proposed theoretical analysis scheme can be generalized to more complex neural networks.
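To make the setting concrete, the following is a minimal, illustrative sketch (not the paper's formal construction) of the FL procedure the abstract refers to: each client runs several local gradient steps on a two-layer ReLU network with its own data, and the server averages the resulting weights. All names, sizes, and hyperparameters here are assumptions for illustration only.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def local_grad(W, a, X, y):
    """Gradient of 0.5 * ||f(X) - y||^2 w.r.t. hidden weights W,
    for f(x) = a^T relu(W x) with the output layer a held fixed."""
    H = X @ W.T                       # pre-activations, shape (n, m)
    err = relu(H) @ a - y             # residuals, shape (n,)
    # dL/dW_r = sum_i err_i * a_r * 1[H_{i,r} > 0] * x_i
    G = ((err[:, None] * (H > 0)) * a[None, :]).T @ X
    return G, 0.5 * float(err @ err)

def fl_round(W, a, clients, lr=1e-2, local_steps=5):
    """One communication round: multi-step local gradient descent on each
    client, then server-side weight averaging. Because each client takes
    several local steps, the aggregated update is NOT a gradient of the
    global loss -- the difficulty the convergence analysis addresses."""
    updated = []
    for X, y in clients:
        Wk = W.copy()
        for _ in range(local_steps):
            G, _ = local_grad(Wk, a, X, y)
            Wk -= lr * G
        updated.append(Wk)
    return np.mean(updated, axis=0)   # server averages client weights

rng = np.random.default_rng(0)
d, m, n_clients = 5, 64, 4            # input dim, width, number of clients
W = rng.normal(size=(m, d)) / np.sqrt(d)
a = rng.choice([-1.0, 1.0], size=m) / np.sqrt(m)  # fixed output layer

clients = []
for _ in range(n_clients):
    X = rng.normal(size=(20, d))
    y = np.sin(X @ rng.normal(size=d))            # synthetic local targets
    clients.append((X, y))

def global_loss(W):
    return sum(local_grad(W, a, X, y)[1] for X, y in clients)

before = global_loss(W)
for _ in range(50):
    W = fl_round(W, a, clients)
after = global_loss(W)
print(before, after)
```

Only the hidden layer is trained and the output layer is fixed at random signs, mirroring a common simplification in NTK-style analyses; the paper's actual setting and parameter choices may differ.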

Author Information

Baihe Huang (Peking University)
Xiaoxiao Li (The University of British Columbia)
Zhao Song (UT-Austin & University of Washington)
Xin Yang (University of Washington)

