

Poster

Zeno++: Robust Fully Asynchronous SGD

Cong Xie · Sanmi Koyejo · Indranil Gupta

Keywords: [ Robust Statistics and Machine Learning ] [ Trustworthy Machine Learning ] [ Safety ]


Abstract:

We propose Zeno++, a new robust asynchronous Stochastic Gradient Descent (SGD) procedure, intended to tolerate Byzantine failures of workers. In contrast to previous work, Zeno++ removes several unrealistic restrictions on worker-server communication, allowing for fully asynchronous updates from anonymous workers, arbitrarily stale worker updates, and a potentially unbounded number of Byzantine workers. The key idea is to estimate the descent of the loss value after a candidate gradient is applied, where a large descent value indicates that the update makes optimization progress. We prove the convergence of Zeno++ for non-convex problems under Byzantine failures. Experimental results show that Zeno++ outperforms existing Byzantine-tolerant asynchronous SGD algorithms.
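To make the key idea concrete, below is a minimal sketch of a descent-based filter, not the authors' implementation: the server evaluates a small validation loss `f_val` before and after tentatively applying a candidate gradient, and accepts the update only if the estimated descent is large enough. The specific score form `f_val(x) - f_val(x - lr * g) - rho * ||g||^2`, the regularization weight `rho`, and the threshold `eps` are illustrative assumptions, not details quoted from the abstract.

```python
# Illustrative sketch of descent-based filtering of candidate gradients.
# The score form, rho, and eps are assumptions for demonstration only.
import numpy as np

def zeno_score(f_val, x, g, lr=0.1, rho=0.001):
    """Estimate the validation-loss descent from applying candidate gradient g."""
    return f_val(x) - f_val(x - lr * g) - rho * np.dot(g, g)

def accept(f_val, x, g, lr=0.1, rho=0.001, eps=0.0):
    """Accept the update only if the estimated descent is at least -eps."""
    return zeno_score(f_val, x, g, lr, rho) >= -eps

# Toy example: quadratic validation loss f(x) = ||x||^2 / 2.
f_val = lambda x: 0.5 * np.dot(x, x)
x = np.array([1.0, -2.0])

honest_g = x              # true gradient of f_val at x
byzantine_g = -10.0 * x   # adversarial update pointing uphill

print(accept(f_val, x, honest_g))     # True: the update decreases the validation loss
print(accept(f_val, x, byzantine_g))  # False: the update increases the validation loss
```

In this sketch the honest gradient passes the filter while the adversarial one is rejected, which is the behavior the descent-estimation idea is meant to provide even when workers are anonymous and arbitrarily stale.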
