Semi-Cyclic Stochastic Gradient Descent
Hubert Eichner · Tomer Koren · Brendan McMahan · Nati Srebro · Kunal Talwar

Thu Jun 13 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #148

We consider convex SGD updates with a block-cyclic structure, i.e., where each cycle consists of a small number of blocks, each with many samples from a possibly different, block-specific, distribution. This situation arises, e.g., in Federated Learning where the mobile devices available for updates at different times during the day have different characteristics. We show that such block-cyclic structure can significantly deteriorate the performance of SGD, but propose a simple approach that allows prediction with the same guarantees as for i.i.d., non-cyclic, sampling.
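To make the block-cyclic sampling structure concrete, here is a minimal sketch (all names, distributions, and parameters are illustrative assumptions, not taken from the paper): SGD runs over repeated cycles, each cycle visits a small number of blocks in a fixed order, and every block contributes many samples from its own block-specific distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: each cycle has k blocks; block b draws samples from its
# own distribution D_b, here a Gaussian around a block-specific optimum.
k_blocks = 2          # blocks per cycle (e.g., "daytime" vs. "nighttime" devices)
n_per_block = 500     # many samples per block
n_cycles = 10
dim = 5
block_means = rng.normal(size=(k_blocks, dim))  # block-specific optima

w = np.zeros(dim)     # model parameters
step = 0
for _ in range(n_cycles):
    for b in range(k_blocks):            # blocks arrive in a fixed cyclic order
        for _ in range(n_per_block):     # samples from block b's distribution
            x = block_means[b] + rng.normal(scale=0.1, size=dim)
            grad = w - x                 # gradient of the convex loss 0.5*||w - x||^2
            step += 1
            w -= grad / step             # standard 1/t step size

# Unlike i.i.d. sampling, the cyclic order can leave the final iterate biased
# toward the block seen last, which is one way performance can deteriorate.
print(w)
```

This only illustrates the sampling pattern the abstract describes; the paper's proposed fix for recovering i.i.d.-style guarantees is not sketched here.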

Author Information

Hubert Eichner (Google)
Tomer Koren (Google Brain)
Brendan McMahan (Google)
Nati Srebro (Toyota Technological Institute at Chicago)
Kunal Talwar (Google)
