Simultaneous Inference for Massive Data: Distributed Bootstrap
Yang Yu · Shih-Kang Chao · Guang Cheng
Keywords:
Large Scale Learning and Big Data
Parallel and Distributed Learning
Supervised Learning
Optimization - Large Scale, Parallel and Distributed
2020 Poster
Abstract
In this paper, we propose a bootstrap method for massive data that are processed distributedly across a large number of machines. The new method is computationally efficient: we bootstrap on the master machine without the over-resampling typically required by existing methods (Kleiner et al., 2014; Sengupta et al., 2016), while provably achieving optimal statistical efficiency with minimal communication. Our method does not require repeatedly re-fitting the model; it only applies a multiplier bootstrap on the master machine to the gradients received from the worker machines. Simulations validate our theory.
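The core computational step described above, a multiplier bootstrap run on the master machine over worker gradients, can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the authors' exact algorithm: the least-squares gradient, the pre-computed inverse-Hessian estimate `Sigma_inv`, the number of draws `B`, and the function names `worker_gradient` and `multiplier_bootstrap_on_master` are all hypothetical choices, and the paper's scaling constants are omitted.

```python
import numpy as np

def worker_gradient(X, y, theta):
    """Local gradient of a least-squares loss on one worker's data shard
    (a hypothetical choice of loss; any smooth M-estimator gradient would do)."""
    return X.T @ (X @ theta - y) / len(y)

def multiplier_bootstrap_on_master(grads, theta_hat, Sigma_inv, B=500, alpha=0.05, seed=0):
    """Sketch of a multiplier bootstrap on the master: reweight the k worker
    gradients with i.i.d. N(0, 1) multipliers, map each perturbed gradient
    through an inverse-Hessian estimate, and use sup-norm quantiles to form
    a simultaneous confidence band (scaling constants omitted for brevity)."""
    rng = np.random.default_rng(seed)
    k, d = grads.shape
    g_bar = grads.mean(axis=0)
    max_stats = np.empty(B)
    for b in range(B):
        eps = rng.standard_normal(k)                        # bootstrap multipliers
        g_boot = (eps[:, None] * (grads - g_bar)).mean(axis=0)
        max_stats[b] = np.abs(Sigma_inv @ g_boot).max()     # sup-norm of the perturbed draw
    c = np.quantile(max_stats, 1 - alpha)                   # simultaneous critical value
    return theta_hat - c, theta_hat + c

# Toy usage: each of k workers sends one gradient vector; no model re-fitting on workers.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d, n, k = 5, 200, 10
    theta_true = np.zeros(d)
    shards = [(X := rng.standard_normal((n, d)), X @ theta_true + rng.standard_normal(n))
              for _ in range(k)]
    theta_hat = np.zeros(d)                                 # plug in the distributed estimate here
    grads = np.stack([worker_gradient(X, y, theta_hat) for X, y in shards])
    Sigma_inv = np.eye(d)                                   # inverse-Hessian estimate (identity for illustration)
    lo, hi = multiplier_bootstrap_on_master(grads, theta_hat, Sigma_inv)
    print("simultaneous band:", lo, hi)
```

Because only gradient vectors travel from workers to the master, each round costs O(kd) communication, which is the source of the communication efficiency highlighted in the abstract.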