

Rate-distortion optimization guided autoencoder for isometric embedding in Euclidean latent space

Keizo Kato · Jing Zhou · Tomotake Sasaki · Akira Nakagawa

Keywords: [ Deep Generative Models ] [ Autoencoders ] [ Deep Learning - Generative Models and Autoencoders ]


To analyze high-dimensional and complex real-world data, deep generative models such as the variational autoencoder (VAE) embed the data in a lower-dimensional latent space and learn a probabilistic model there. However, these models struggle to accurately reproduce the probability distribution function (PDF) of the input space from that of the latent space. If the embedding were isometric, this problem could be solved, since the PDFs in the two spaces would be proportional. To achieve this isometric property, we propose a rate-distortion optimization guided autoencoder inspired by orthonormal transform coding. We show that our method has the following properties: (i) the columns of the Jacobian matrix between the two spaces form a constantly-scaled orthonormal system, enabling isometric embedding of the data in the latent space; (ii) the PDF of the latent space is proportional to that of the data observation space. Furthermore, our method outperforms state-of-the-art methods in unsupervised anomaly detection on four public datasets.
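As a minimal numerical sketch (not the paper's method), the proportionality claim in (ii) can be illustrated in the simplest setting: a linear map whose Jacobian is a constantly scaled orthogonal matrix, J = cQ. Then |det J| = c^d, so the change-of-variables formula p_z(z) = p_x(x) / |det J| makes the two densities proportional with a constant factor. The dimension `d`, scale `c`, and the Gaussian source are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, c = 3, 2.0  # illustrative dimension and constant scale

# A constantly-scaled orthonormal Jacobian: J = c * Q, Q orthogonal (via QR).
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
J = c * Q

def gauss_pdf(v, scale=1.0):
    """Density of N(0, scale^2 * I) at point v."""
    k = len(v)
    return (2 * np.pi * scale**2) ** (-k / 2) * np.exp(-(v @ v) / (2 * scale**2))

# For x ~ N(0, I) and z = J x, we have z ~ N(0, c^2 I). The ratio
# p_x(x) / p_z(z) equals |det J| = c^d for every sample, i.e. the
# PDFs in the two spaces are proportional.
ratios = []
for _ in range(5):
    x = rng.standard_normal(d)
    z = J @ x
    ratios.append(gauss_pdf(x) / gauss_pdf(z, scale=c))
print(ratios)  # each ratio equals c**d = 8.0 (up to floating-point error)
```

In the paper's nonlinear setting the same factor appears locally through the Jacobian of the decoder, which is why the isometric (scaled-orthonormal) property makes the latent-space PDF usable as a proxy for the data-space PDF, e.g. for anomaly scoring.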
