Hierarchical Importance Weighted Autoencoders
Chin-Wei Huang · Kris Sankaran · Eeshan Dhekane · Alexandre Lacoste · Aaron Courville

Thu Jun 13th 05:05 -- 05:10 PM @ Grand Ballroom

Importance weighted variational inference (Burda et al., 2016) uses multiple i.i.d. samples from the proposal to obtain a tighter variational lower bound. We believe a joint proposal has the potential to reduce the number of redundant samples, and we introduce a hierarchical structure to induce correlation among them. The hope is that the proposals coordinate to compensate for one another's errors, reducing the variance of the estimator as a whole. Theoretically, we analyze the conditions under which convergence of the estimator variance can be connected to convergence of the lower bound. Empirically, we confirm that maximizing the lower bound does implicitly minimize variance. Further analysis shows that this is a result of the negative correlation induced by the proposed hierarchical meta-sampling scheme, and that inference performance also improves as the number of samples increases.
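To make the multi-sample bound concrete, here is a minimal NumPy sketch of the importance weighted lower bound of Burda et al. (2016) with i.i.d. proposal samples. The toy Gaussian model and proposal parameters are illustrative assumptions, not from the paper; the hierarchical (correlated) proposal itself is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumed for illustration): p(z) = N(0, 1), p(x|z) = N(z, 1),
# with proposal q(z|x) = N(mu, sigma^2) using hand-picked parameters.
x = 1.5
mu, sigma = 0.5, 1.2

def log_p_xz(z, x):
    # log p(z) + log p(x|z) for the toy Gaussian model
    return -0.5 * (z**2 + (x - z)**2) - np.log(2 * np.pi)

def log_q(z):
    # log density of the Gaussian proposal q(z|x)
    return -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def iwae_bound(K, n_rep=20000):
    # Monte Carlo estimate of E[ log (1/K) sum_k p(x, z_k) / q(z_k|x) ]
    # using n_rep independent outer replications of K i.i.d. samples.
    z = rng.normal(mu, sigma, size=(n_rep, K))
    log_w = log_p_xz(z, x) - log_q(z)  # (n_rep, K) log importance weights
    # numerically stable log-mean-exp over the K samples
    m = log_w.max(axis=1, keepdims=True)
    return float(np.mean(m.squeeze() + np.log(np.mean(np.exp(log_w - m), axis=1))))

elbo = iwae_bound(1)   # K = 1 recovers the standard ELBO
l5 = iwae_bound(5)
l50 = iwae_bound(50)
print(elbo, l5, l50)   # the bound tightens monotonically as K grows
```

With i.i.d. samples the bound tightens as K grows; the paper's hierarchical scheme replaces the i.i.d. draws with correlated ones so that the samples coordinate and fewer are redundant.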

Author Information

Chin-Wei Huang (Mila)
Kris Sankaran (Mila)
Eeshan Dhekane (Mila, Université de Montréal)
Alexandre Lacoste (Element AI)
Aaron Courville (Université de Montréal)
