In this paper, we introduce several auto-encoder models that preserve local distances in the latent space. We use a local distance preserving loss based on the continuous k-nearest neighbour graph, which is known to capture topological features at all scales simultaneously. To improve training performance, we formulate learning as a constrained optimisation problem with local distance preservation as the main objective and reconstruction accuracy as a constraint. We generalise this approach to hierarchical variational auto-encoders, thus learning generative models with geometrically consistent latent and data spaces. Our method provides state-of-the-art performance across several standard datasets and evaluation metrics.
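To make the constrained formulation concrete, the following is a minimal sketch, not the paper's implementation: it replaces the continuous k-nearest-neighbour graph with a plain hard k-NN neighbourhood and enforces the reconstruction constraint with a simple dual-ascent Lagrange multiplier. The names `model.encode`, `model.decode`, `k`, `recon_tol`, and `lam_lr` are assumed placeholders, not symbols from the paper.

```python
import torch
import torch.nn.functional as F

def local_distance_loss(x, z, k=5):
    """Penalise the mismatch between data-space and latent-space distances
    over each point's k nearest neighbours (a hard k-NN stand-in for the
    continuous k-NN graph described in the abstract)."""
    dx = torch.cdist(x, x)                               # pairwise distances in data space
    dz = torch.cdist(z, z)                               # pairwise distances in latent space
    knn = dx.topk(k + 1, largest=False).indices[:, 1:]   # k nearest neighbours, self excluded
    return F.mse_loss(dz.gather(1, knn), dx.gather(1, knn))

def training_step(model, x, lam, recon_tol=0.05, lam_lr=0.01):
    """One constrained-optimisation step: distance preservation is the main
    objective, and reconstruction error is kept near the tolerance `recon_tol`
    via a Lagrange multiplier `lam` updated by dual ascent."""
    z = model.encode(x)              # `encode`/`decode` are assumed auto-encoder hooks
    x_hat = model.decode(z)
    recon = F.mse_loss(x_hat, x)
    loss = local_distance_loss(x, z) + lam * (recon - recon_tol)
    lam = max(0.0, lam + lam_lr * (recon.item() - recon_tol))  # dual ascent on the multiplier
    return loss, lam
```

In a training loop, `x` would be a flattened mini-batch, `loss.backward()` and an optimiser step would follow, and `lam` would be carried across iterations so the reconstruction constraint tightens over training.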
Author Information
Nutan Chen (Machine Learning Research Lab, Volkswagen Group)
Patrick van der Smagt (Volkswagen Group)
Botond Cseke (Volkswagen Group)
More from the Same Authors
- 2021: Exploration via Empowerment Gain: Combining Novelty, Surprise and Learning Progress
  Philip Becker-Ehmck · Maximilian Karl · Jan Peters · Patrick van der Smagt
- 2022: Probabilistic Dalek - Emulator framework with probabilistic prediction for supernova tomography
  Wolfgang Kerzendorf · Nutan Chen · Patrick van der Smagt
- 2020 Poster: Learning Flat Latent Manifolds with VAEs
  Nutan Chen · Alexej Klushyn · Francesco Ferroni · Justin Bayer · Patrick van der Smagt
- 2019 Poster: Switching Linear Dynamics for Variational Bayes Filtering
  Philip Becker-Ehmck · Jan Peters · Patrick van der Smagt
- 2019 Oral: Switching Linear Dynamics for Variational Bayes Filtering
  Philip Becker-Ehmck · Jan Peters · Patrick van der Smagt