Oral
Adaptive Antithetic Sampling for Variance Reduction
Hongyu Ren · Shengjia Zhao · Stefano Ermon

Thu Jun 13 10:10 AM -- 10:15 AM (PDT) @ Room 101

Variance reduction techniques are crucial in stochastic estimation and optimization problems. Antithetic sampling techniques reduce the variance of a Monte Carlo estimator by drawing correlated, rather than independent, samples. Designing the right correlation structure, however, is challenging and application specific, thus limiting the practical applicability of these methods. In this paper, we propose a general-purpose adaptive antithetic sampling framework. We leverage advances in generative models and stochastic computation graphs to define a flexible family of antithetic samplers. We provide gradient-based and gradient-free methods to train the samplers such that they reduce variance while ensuring that the underlying Monte Carlo estimator is provably unbiased. We demonstrate the effectiveness of our approach on Bayesian inference and generative model training tasks, where it reduces variance and improves task performance with little or no computational overhead.
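For context, the abstract builds on the classic (non-adaptive) antithetic sampling idea: pairing each sample with a negatively correlated counterpart keeps the estimator unbiased while lowering its variance. The sketch below illustrates that baseline idea only, not the adaptive framework proposed in the paper; the test function and sample sizes are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of classic antithetic sampling: estimate E[f(U)] for
# U ~ Uniform(0, 1) with f(x) = exp(x) (an illustrative choice, not
# taken from the paper). True value is e - 1.
rng = np.random.default_rng(0)
f = np.exp
n_pairs = 5_000

# Baseline: independent Monte Carlo with 2 * n_pairs samples.
u_iid = rng.uniform(size=2 * n_pairs)
est_iid = f(u_iid).mean()

# Antithetic estimator: each draw u is paired with 1 - u. For a
# monotone f the two evaluations are negatively correlated, so their
# average has lower variance while the estimator remains unbiased.
u = rng.uniform(size=n_pairs)
est_anti = (0.5 * (f(u) + f(1.0 - u))).mean()

print(f"iid estimate:        {est_iid:.5f}")
print(f"antithetic estimate: {est_anti:.5f}")
print(f"true value:          {np.e - 1:.5f}")
```

The paper's contribution, as the abstract describes, is to learn such correlation structures automatically with generative models rather than hand-designing pairings like (u, 1 - u).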

Author Information

Hongyu Ren (Stanford University)
Shengjia Zhao (Stanford University)
Stefano Ermon (Stanford University)
