Online Variance Reduction with Mixtures
Zalán Borsos · Sebastian Curi · Yehuda Levy · Andreas Krause

Thu Jun 13th 10:10 -- 10:15 AM @ Room 102

Adaptive importance sampling for stochastic optimization is a promising approach that offers improved convergence through variance reduction. In this work, we propose a new framework for variance reduction that enables the use of mixtures over predefined sampling distributions, which can naturally encode prior knowledge about the data. While these sampling distributions are fixed, the mixture weights are adapted during the optimization process. We propose VRM, a novel and efficient adaptive scheme that asymptotically recovers the best mixture weights in hindsight and can also accommodate sampling distributions over sets of points. We empirically demonstrate the versatility of VRM in a range of applications.
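The idea in the abstract can be illustrated with a small sketch: importance-weighted SGD where the sampling distribution is a mixture of two predefined components, and the mixture weights are nudged online toward lower gradient-estimator variance. This is an illustrative surrogate under assumed details (toy quadratic losses, a hand-picked "hard cluster" component, and a plain exponentiated-gradient update on a variance estimate), not the paper's VRM update rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: minimize (1/n) * sum_i 0.5 * (w - x_i)^2, whose optimum is mean(x).
n = 1000
x = rng.normal(0.0, 1.0, n)
x[:50] += 10.0  # a small cluster of "hard" points with large gradients

# Two fixed (predefined) sampling distributions over the n points:
# component 0 is uniform, component 1 concentrates on the hard cluster.
p = np.stack([
    np.full(n, 1.0 / n),
    np.concatenate([np.full(50, 0.9 / 50), np.full(n - 50, 0.1 / (n - 50))]),
])
K = p.shape[0]

alpha = np.full(K, 1.0 / K)    # mixture weights, adapted online
w, lr, eta = 0.0, 0.05, 5e-4   # iterate, SGD step size, mixture step size

for t in range(5000):
    q = alpha @ p                       # current mixture sampling distribution
    i = rng.choice(n, p=q)
    g = w - x[i]                        # gradient of the i-th loss at w
    w -= lr * g / (n * q[i])            # unbiased importance-weighted SGD step
    # Exponentiated-gradient step on a one-sample estimate of the gradient
    # estimator's variance (illustrative only; not the paper's exact rule).
    grad = -(g ** 2) * p[:, i] / (n ** 2 * q[i] ** 3)
    alpha *= np.exp(np.clip(-eta * grad, -0.5, 0.5))
    alpha /= alpha.sum()
    alpha = 0.99 * alpha + 0.01 / K     # keep every component slightly active

print(w, alpha)
```

Note that only the mixture weights `alpha` move; the component distributions in `p` stay fixed, which is what lets prior knowledge about the data (here, a known hard cluster) be encoded up front.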

Author Information

Zalán Borsos (ETH Zurich)
Sebastian Curi (ETH)
Yehuda Levy (ETH Zurich)
Andreas Krause (ETH Zurich)
