
Poster in Workshop: Federated Learning and Analytics in Practice: Algorithms, Systems, Applications, and Opportunities

$\texttt{FED-CURE}$: A Robust Federated Learning Algorithm with Cubic Regularized Newton

Avishek Ghosh · Raj Kumar Maity · Arya Mazumdar


Abstract: In this paper, we analyze the cubic-regularized Newton method, which avoids saddle points in non-convex optimization, within the Federated Learning (FL) framework, while simultaneously addressing several practical challenges that naturally arise in FL, such as the communication bottleneck and Byzantine attacks. We propose FEDerated CUbic REgularized Newton ($\texttt{FED-CURE}$) and obtain convergence guarantees under several settings. Being a second-order algorithm, $\texttt{FED-CURE}$ has a much lower iteration complexity than its first-order counterparts; furthermore, we can use compression (or sparsification) techniques, such as $\delta$-approximate compression, to achieve communication efficiency, and norm-based thresholding for Byzantine resilience. We validate the performance of $\texttt{FED-CURE}$ with experiments on standard datasets under several types of Byzantine attacks, obtaining a $25\%$ improvement over first-order methods in total iteration complexity.
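For intuition, here is a minimal Python sketch of the kind of round the abstract describes: workers send local gradients and Hessians, the server discards the highest-norm messages (norm-based thresholding), averages the rest, and takes a cubic-regularized Newton step. This is an illustrative assumption of the scheme, not the authors' exact algorithm: the inexact cubic-subproblem solver, the separate filtering of gradients and Hessians, and the names `M` (cubic penalty) and `beta` (trimmed fraction) are all hypothetical choices, and compression is omitted.

```python
import numpy as np

def norm_threshold(msgs, beta):
    """Keep the (1 - beta) fraction of worker messages with the smallest
    norms. A simple norm-based thresholding filter; the paper's exact
    rule may differ."""
    norms = np.array([np.linalg.norm(m) for m in msgs])
    keep = np.argsort(norms)[: max(1, int(len(msgs) * (1 - beta)))]
    return [msgs[i] for i in keep]

def solve_cubic_subproblem(g, H, M, iters=500):
    """Approximately minimize <g, h> + 0.5 h^T H h + (M/6) ||h||^3 by
    gradient descent on the cubic model (a crude inexact solver)."""
    h = np.zeros_like(g)
    lr = 1.0 / (np.linalg.norm(H, 2) + M)  # conservative step size
    for _ in range(iters):
        grad = g + H @ h + 0.5 * M * np.linalg.norm(h) * h
        h -= lr * grad
    return h

def fed_cure_step(x, grads, hessians, M, beta):
    """One server-side round: filter, average, cubic Newton update."""
    grads = norm_threshold(grads, beta)        # Byzantine filter on gradients
    hessians = norm_threshold(hessians, beta)  # ... and on Hessians (Frobenius norm)
    g = np.mean(grads, axis=0)
    H = np.mean(hessians, axis=0)
    return x + solve_cubic_subproblem(g, H, M)

# Toy usage: local least-squares losses f_i(x) = 0.5 ||A_i x - b_i||^2.
rng = np.random.default_rng(0)
d, n_workers = 5, 10
As = [rng.normal(size=(20, d)) for _ in range(n_workers)]
bs = [rng.normal(size=20) for _ in range(n_workers)]
x = np.zeros(d)
for _ in range(10):
    grads = [A.T @ (A @ x - b) for A, b in zip(As, bs)]
    hessians = [A.T @ A for A in As]
    x = fed_cure_step(x, grads, hessians, M=1.0, beta=0.2)
```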
