
Oral

Semi-Cyclic Stochastic Gradient Descent

Hubert Eichner · Tomer Koren · Brendan McMahan · Nati Srebro · Kunal Talwar

Abstract:

We consider convex SGD updates with a block-cyclic structure, i.e. where each cycle consists of a small number of blocks, each with many samples from a possibly different, block-specific, distribution. This situation arises, e.g., in Federated Learning, where the mobile devices available for updates at different times of day have different characteristics. We show that such block-cyclic structure can significantly deteriorate the performance of SGD, but propose a simple correction approach that allows prediction with the same performance guarantees as for i.i.d., non-cyclic, sampling.
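The setting can be illustrated with a small simulation. The sketch below (a hypothetical toy example, not the authors' exact algorithm or analysis) runs one-dimensional least-squares SGD over a cycle of two blocks whose label distributions differ, e.g. "daytime" vs. "nighttime" devices. A single shared iterate ends up biased toward whichever block it saw last, whereas keeping a separate iterate per block, in the spirit of the paper's per-block correction, lets each block's predictions track its own distribution. Block means, cycle counts, and step size are all assumed for illustration.

```python
import random

random.seed(0)

# Assumed block-specific label distributions: two blocks per cycle.
BLOCK_MEANS = [1.0, -1.0]
CYCLES, BLOCK_LEN, LR = 200, 10, 0.05

def sample(block):
    """Draw a noisy label from the given block's distribution."""
    return BLOCK_MEANS[block] + random.gauss(0.0, 0.1)

# Naive SGD: one shared iterate updated on whichever block is active.
# Within each block it drifts toward that block's mean, so at the end
# of training it is biased toward the most recent block.
w = 0.0
for _ in range(CYCLES):
    for block in range(len(BLOCK_MEANS)):
        for _ in range(BLOCK_LEN):
            y = sample(block)
            w -= LR * (w - y)  # gradient of 0.5 * (w - y)**2

# Per-block correction (sketch): a separate iterate for each block,
# updated only on that block's samples, so each converges toward its
# own block mean.
w_per_block = [0.0 for _ in BLOCK_MEANS]
for _ in range(CYCLES):
    for block in range(len(BLOCK_MEANS)):
        for _ in range(BLOCK_LEN):
            y = sample(block)
            w_per_block[block] -= LR * (w_per_block[block] - y)

print("shared iterate:", round(w, 2))
print("per-block iterates:", [round(v, 2) for v in w_per_block])
```

Here the shared iterate oscillates between the two block means each cycle, while the per-block iterates settle near their respective means, which is the failure mode and fix the abstract describes in miniature.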