
Learning Discrete and Continuous Factors of Data via Alternating Disentanglement

Yeonwoo Jeong · Hyun Oh Song

Pacific Ballroom #5

Keywords: [ Representation Learning ] [ Deep Generative Models ]


We address the problem of unsupervised disentanglement of discrete and continuous explanatory factors of data. We first present a simple procedure that minimizes the total correlation of the continuous latent variables by cascading the information flow in the beta-VAE framework, without requiring a discriminator network or importance sampling. Furthermore, we propose a method that avoids offloading the entire burden of jointly modeling the continuous and discrete factors onto the variational encoder, by employing a separate discrete inference procedure.
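For reference, the total correlation targeted by the first procedure is the standard quantity from the disentanglement literature (notation ours, not taken from the abstract): with aggregate posterior q(z), it is

$$\mathrm{TC}(\mathbf{z}) = D_{\mathrm{KL}}\!\left( q(\mathbf{z}) \,\middle\|\, \textstyle\prod_{j} q(z_j) \right),$$

which vanishes exactly when the aggregate posterior factorizes across latent dimensions, i.e. when the continuous factors are statistically independent.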

This leads to an interesting alternating minimization problem that switches between finding the most likely discrete configuration given the continuous factors and updating the variational encoder based on the computed discrete factors. Experiments show that the proposed method clearly disentangles discrete factors and significantly outperforms current disentanglement methods on both the disentanglement score and the inference network classification score. The source code is available at
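The alternating scheme described above can be sketched in a toy, pure-Python form. Everything here (the linear "decoder", the scalar continuous factor z, the learning rate) is an illustrative assumption, not the authors' model; it only shows the two alternating steps: an exact search over discrete categories, then a gradient step on the continuous factor with the discrete code held fixed.

```python
# Toy sketch of alternating minimization over a discrete code k and a
# continuous factor z (illustrative only; not the authors' implementation).
import random

random.seed(0)
K, D = 3, 4                                            # categories, data dim
w = [random.gauss(0, 1) for _ in range(D)]             # "decoder" weights
embed = [[random.gauss(0, 1) for _ in range(D)]        # per-category offsets
         for _ in range(K)]

def decode(z, k):
    """Reconstruct data from continuous factor z and discrete code k."""
    return [z * w[i] + embed[k][i] for i in range(D)]

def sq_err(x, x_hat):
    return sum((a - b) ** 2 for a, b in zip(x, x_hat))

def best_discrete(x, z):
    # Step 1: enumerate all K categories, keep the most likely
    # (lowest reconstruction error) given the current continuous factor.
    return min(range(K), key=lambda k: sq_err(x, decode(z, k)))

def continuous_step(x, z, k, lr=0.01):
    # Step 2: one gradient step on z with the discrete code k held fixed.
    grad = sum(2 * (z * w[i] + embed[k][i] - x[i]) * w[i] for i in range(D))
    return z - lr * grad

# Alternate the two steps on data generated from category 2 with z* = 1.5.
x = decode(1.5, 2)
z, errs = 0.0, []
for _ in range(100):
    k = best_discrete(x, z)
    z = continuous_step(x, z, k)
    errs.append(sq_err(x, decode(z, k)))
```

Because step 1 is an exact minimization over k and step 2 is a small gradient step on a convex objective, the reconstruction error is non-increasing across iterations, which is the basic appeal of the alternating formulation.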