Poster
DVAE++: Discrete Variational Autoencoders with Overlapping Transformations
Arash Vahdat · William Macready · Zhengbing Bian · Amir Khoshaman · Evgeny Andriyash

Fri Jul 13 09:15 AM -- 12:00 PM (PDT) @ Hall B #85

Training of discrete latent variable models remains challenging because passing gradient information through discrete units is difficult. We propose a new class of smoothing transformations based on a mixture of two overlapping distributions, and show that the proposed transformation can be used for training binary latent models with either directed or undirected priors. We derive a new variational bound to efficiently train with Boltzmann machine priors. Using this bound, we develop DVAE++, a generative model with a global discrete prior and a hierarchy of convolutional continuous variables. Experiments on several benchmarks show that overlapping transformations outperform other recent continuous relaxations of discrete latent variables, including Gumbel-Softmax (Maddison et al., 2016; Jang et al., 2016) and discrete variational autoencoders (Rolfe, 2016).
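The core idea in the abstract is to pair each binary unit z with a continuous relaxation ζ drawn from a mixture of two overlapping distributions, so that ζ is a smooth, reparameterizable function of the Bernoulli probability q. The sketch below illustrates one plausible instantiation: a mixture of two overlapping exponential distributions on [0, 1], sampled by inverting the mixture CDF in closed form. The function name, the default inverse temperature beta, and the specific exponential pair are assumptions of this sketch, not necessarily the paper's exact construction.

    import numpy as np

    def sample_overlapping_exponentials(q, beta=8.0, rng=None, eps=1e-6):
        # Reparameterized sample zeta ~ (1 - q) * r(zeta|z=0) + q * r(zeta|z=1),
        # with r(zeta|z=0) proportional to exp(-beta * zeta) and
        # r(zeta|z=1) proportional to exp(beta * (zeta - 1)) on [0, 1].
        # Illustrative smoothing pair; beta=8.0 is a hypothetical default.
        rng = np.random.default_rng() if rng is None else rng
        q = np.clip(np.asarray(q, dtype=float), eps, 1.0 - eps)  # avoid q in {0, 1}
        rho = rng.uniform(size=q.shape)  # uniform noise of the reparameterization
        eb = np.exp(-beta)
        # Inverting the mixture CDF reduces to a quadratic in u = exp(-beta * zeta):
        #   (1 - q) u^2 + b u - q * exp(-beta) = 0,  with b as below;
        # the positive root gives u in [exp(-beta), 1], i.e. zeta in [0, 1].
        b = rho * (1.0 - eb) + q * eb - (1.0 - q)
        u = (-b + np.sqrt(b * b + 4.0 * (1.0 - q) * q * eb)) / (2.0 * (1.0 - q))
        return -np.log(u) / beta  # smooth and differentiable in q

    # Example: relax a batch of Bernoulli probabilities; larger beta pushes
    # samples toward the corners {0, 1}.
    zeta = sample_overlapping_exponentials(q=np.array([0.1, 0.5, 0.9]), beta=10.0)

Because ζ is an explicit, differentiable function of q and the uniform noise ρ, the same expressions written in an autodiff framework let gradients flow through the discrete unit's probability during training, which is the property the abstract's smoothing transformations are designed to provide.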

Author Information

Arash Vahdat (Quadrant.ai, D-Wave Systems Inc.)
William Macready (D-Wave Systems Inc.)
Zhengbing Bian (Quadrant.ai, D-Wave Systems Inc.)
Amir Khoshaman (D-Wave Systems Inc.)
Evgeny Andriyash (D-Wave Systems Inc.)
