

Poster

Simultaneous Inference for Massive Data: Distributed Bootstrap

Yang Yu · Shih-Kang Chao · Guang Cheng

Virtual

Keywords: [ Large Scale Learning and Big Data ] [ Parallel and Distributed Learning ] [ Supervised Learning ] [ Optimization - Large Scale, Parallel and Distributed ]


Abstract:

In this paper, we propose a bootstrap method for massive data that are processed in a distributed fashion across a large number of machines. The new method is computationally efficient: it bootstraps on the master machine without the over-resampling typically required by existing methods (Kleiner et al., 2014; Sengupta et al., 2016), while provably achieving optimal statistical efficiency with minimal communication. Our method does not require repeatedly re-fitting the model; instead, the master machine applies a multiplier bootstrap to the gradients received from the worker machines. Simulations validate our theory.
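The core step, a multiplier bootstrap applied on the master machine to the worker gradients, can be sketched as follows. This is a minimal illustration assuming each worker has already sent its averaged gradient to the master; the function name, the Gaussian multipliers, the scaling, and the 95% level are assumptions for illustration, not the paper's exact statistic.

```python
import numpy as np

def multiplier_bootstrap(grads, B=1000, level=0.95, seed=None):
    """Illustrative sketch of a master-side multiplier bootstrap.

    grads : (k, p) array, one averaged gradient per worker machine
    B     : number of bootstrap replicates
    Returns an approximate `level` quantile of the max-norm statistic,
    usable as a simultaneous confidence-band width.
    """
    rng = np.random.default_rng(seed)
    k, p = grads.shape
    centered = grads - grads.mean(axis=0)   # center across machines
    eps = rng.standard_normal((B, k))       # Gaussian multiplier weights
    # Each replicate reweights the centered per-machine gradients,
    # perturbing the averaged gradient without re-fitting the model.
    boot = eps @ centered / np.sqrt(k)      # (B, p) bootstrap draws
    stats = np.abs(boot).max(axis=1)        # sup-norm over coordinates
    return np.quantile(stats, level)
```

In the paper's setting, such a quantile would feed into simultaneous confidence intervals for the model parameters; the sketch stops at the bootstrap quantile itself and omits any Hessian-type correction.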
