

Poster

Quantifying and Learning Linear Symmetry-Based Disentanglement

Loek Tonnaer · Luis Armando Perez Rey · Vlado Menkovski · Mike Holenderski · Jacobus Portegies

Hall E #439

Keywords: [ DL: Other Representation Learning ]


Abstract:

The definition of Linear Symmetry-Based Disentanglement (LSBD) formalizes the notion of linearly disentangled representations, but there is currently no metric to quantify LSBD. Such a metric is crucial to evaluate LSBD methods and to compare them to previous understandings of disentanglement. We propose D_LSBD, a mathematically sound metric to quantify LSBD, and provide a practical implementation for SO(2) groups. Furthermore, from this metric we derive LSBD-VAE, a semi-supervised method to learn LSBD representations. We demonstrate the utility of our metric by showing that (1) common VAE-based disentanglement methods do not learn LSBD representations, (2) LSBD-VAE, as well as other recent methods, can learn LSBD representations requiring only limited supervision on transformations, and (3) various desirable properties expressed by existing disentanglement metrics are also achieved by LSBD representations.
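The abstract mentions a practical implementation of the metric for SO(2) groups. The following is a minimal sketch, not the authors' implementation, of one way an LSBD-style score for SO(2) could be computed: if latent codes are produced by a linear SO(2) representation of some integer order k, undoing the rotation associated with each data point should collapse all codes onto a single point, so the residual dispersion (minimized over candidate orders k) can serve as a score. The function name `lsbd_dispersion_so2`, the unit-circle normalization, and the `max_order` parameter are illustrative assumptions.

```python
import numpy as np

def lsbd_dispersion_so2(z, angles, max_order=5):
    """Illustrative LSBD-style dispersion score for SO(2) (not the authors' code).

    z      : (N, 2) latent codes of observations generated by planar rotations.
    angles : (N,) rotation angle (group element) that produced each observation.

    If z is linearly symmetry-based disentangled, z_i ~ R(k * theta_i) @ z_0 for
    some integer order k. Applying R(-k * theta_i) should then map all codes to
    (almost) the same point; we return the smallest residual dispersion over k.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # project codes to the unit circle
    best = np.inf
    for k in range(1, max_order + 1):
        c, s = np.cos(-k * angles), np.sin(-k * angles)
        # apply the inverse group action R(-k * theta_i) to each code
        z_back = np.stack([c * z[:, 0] - s * z[:, 1],
                           s * z[:, 0] + c * z[:, 1]], axis=1)
        mean = z_back.mean(axis=0)
        dispersion = np.mean(np.sum((z_back - mean) ** 2, axis=1))
        best = min(best, dispersion)
    return best  # 0 indicates a perfect linear SO(2) representation of some order k

# Toy usage: codes generated by an order-2 rotation representation score ~0.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, size=100)
z = np.stack([np.cos(2 * theta), np.sin(2 * theta)], axis=1)
print(lsbd_dispersion_so2(z, theta))  # close to 0
```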
