
Rate-distortion optimization guided autoencoder for isometric embedding in Euclidean latent space
Keizo Kato · Jing Zhou · Tomotake Sasaki · Akira Nakagawa

Tue Jul 14 06:00 PM -- 06:45 PM & Wed Jul 15 04:00 AM -- 04:45 AM (PDT)

To analyze high-dimensional and complex real-world data, deep generative models such as the variational autoencoder (VAE) embed data in a reduced-dimensional latent space and learn a probabilistic model there. However, they struggle to accurately reproduce the probability distribution function (PDF) of the input space from that of the latent space. If the embedding were isometric, this problem could be solved, since the PDFs in the two spaces would become proportional. To achieve this isometric property, we propose a Rate-Distortion Optimization guided autoencoder inspired by orthonormal transform coding. We show that our method has the following properties: (i) the columns of the Jacobian matrix between the two spaces form a constantly-scaled orthonormal system, which enables isometric embedding of the data in the latent space; (ii) the PDF of the latent space is proportional to that of the data observation space. Furthermore, our method outperforms state-of-the-art methods in unsupervised anomaly detection on four public datasets.
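The link between the two claimed properties can be illustrated numerically. The sketch below (not the authors' code; all names and dimensions are illustrative) builds a Jacobian J whose columns are orthonormal up to a constant scale c, and checks that the change-of-variables factor sqrt(det(J^T J)) = c^d is the same constant everywhere, which is exactly why the latent-space PDF is proportional to the data-space PDF.

```python
# Illustrative sketch, assuming a linear embedding from a d-dim latent
# space into a D-dim data space whose Jacobian J has constantly-scaled
# orthonormal columns (J^T J = c^2 I), as property (i) states.
import numpy as np

rng = np.random.default_rng(0)
D, d, c = 10, 3, 2.5            # data dim, latent dim, constant scale

# Build a D x d matrix with orthonormal columns via QR, then scale by c.
Q, _ = np.linalg.qr(rng.standard_normal((D, d)))
J = c * Q

# (i) Columns form a constantly-scaled orthonormal system: J^T J = c^2 I.
assert np.allclose(J.T @ J, c**2 * np.eye(d))

# (ii) The change-of-variables factor sqrt(det(J^T J)) = c^d does not
#      depend on where in the latent space we are, so the latent PDF is
#      proportional to the data PDF (constant of proportionality c^d).
factor = np.sqrt(np.linalg.det(J.T @ J))
assert np.isclose(factor, c**d)
print(factor)  # c^d = 2.5**3 = 15.625
```

For a nonlinear isometric embedding the same argument applies pointwise: the Jacobian varies with z, but its columns stay constantly-scaled orthonormal, so the density ratio remains the same constant.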

Author Information

Keizo Kato (Fujitsu Laboratories Ltd.)
Jing Zhou (Alibaba)
Tomotake Sasaki (Fujitsu Laboratories Ltd.)
Akira Nakagawa (Fujitsu Laboratories Ltd.)
