Deep generative models have been demonstrated as state-of-the-art density estimators. Yet, recent work has found that they often assign higher likelihoods to data from outside the training distribution. This seemingly paradoxical behavior has raised concerns about the quality of the attained density estimates. In the context of hierarchical variational autoencoders, we provide evidence that this behavior arises because out-of-distribution (OOD) data share low-level features with in-distribution data. We argue that this is both expected and desirable behavior. With this insight in hand, we develop a fast, scalable, and fully unsupervised likelihood-ratio score for OOD detection that requires data to be in-distribution across all feature levels. We benchmark the method on a broad set of data and model combinations and achieve state-of-the-art results on out-of-distribution detection.
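To make the idea of a likelihood-ratio score across feature levels concrete, below is a minimal, hypothetical sketch and not the authors' implementation: a two-level VAE in which the score contrasts the full ELBO with an ELBO where the lowest-level latent is drawn from its conditional prior, so only the higher-level (more semantic) latent is inferred from the input. The class `TwoLevelVAE`, the helpers `elbo` and `llr_score`, the architecture, and all hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' code): a two-level VAE on flattened
# binary images and a likelihood-ratio OOD score that contrasts the full ELBO
# with an ELBO in which the lowest-level latent z1 is drawn from its prior,
# so only the higher-level latent z2 is inferred from the data.
import torch
import torch.nn as nn
import torch.distributions as td


class TwoLevelVAE(nn.Module):
    def __init__(self, x_dim=784, z1_dim=32, z2_dim=16, h=256):
        super().__init__()
        # Bottom-up inference: q(z1 | x) and q(z2 | z1)
        self.enc1 = nn.Sequential(nn.Linear(x_dim, h), nn.ReLU(), nn.Linear(h, 2 * z1_dim))
        self.enc2 = nn.Sequential(nn.Linear(z1_dim, h), nn.ReLU(), nn.Linear(h, 2 * z2_dim))
        # Top-down generation: p(z1 | z2) and p(x | z1); p(z2) is a standard normal
        self.dec1 = nn.Sequential(nn.Linear(z2_dim, h), nn.ReLU(), nn.Linear(h, 2 * z1_dim))
        self.dec_x = nn.Sequential(nn.Linear(z1_dim, h), nn.ReLU(), nn.Linear(h, x_dim))

    @staticmethod
    def _gaussian(params):
        mu, log_var = params.chunk(2, dim=-1)
        return td.Independent(td.Normal(mu, torch.exp(0.5 * log_var)), 1)

    def elbo(self, x, skip_lowest=False):
        """Single-sample ELBO. If skip_lowest=True, z1 is drawn from the
        conditional prior p(z1 | z2) instead of the posterior q(z1 | x)."""
        q_z1 = self._gaussian(self.enc1(x))
        z1_q = q_z1.rsample()
        q_z2 = self._gaussian(self.enc2(z1_q))
        z2 = q_z2.rsample()
        p_z2 = td.Independent(td.Normal(torch.zeros_like(z2), torch.ones_like(z2)), 1)
        p_z1 = self._gaussian(self.dec1(z2))
        z1 = p_z1.rsample() if skip_lowest else z1_q
        p_x = td.Independent(td.Bernoulli(logits=self.dec_x(z1)), 1)
        elbo = p_x.log_prob(x) + p_z2.log_prob(z2) - q_z2.log_prob(z2)
        if not skip_lowest:  # the z1 KL term only enters when z1 is inferred from x
            elbo = elbo + p_z1.log_prob(z1) - q_z1.log_prob(z1)
        return elbo


def llr_score(model, x, n_samples=32):
    """Likelihood-ratio OOD score: high when the full ELBO beats the ELBO that
    ignores low-level information, i.e. when all feature levels fit the data."""
    with torch.no_grad():
        full = torch.stack([model.elbo(x) for _ in range(n_samples)]).mean(0)
        skip = torch.stack([model.elbo(x, skip_lowest=True) for _ in range(n_samples)]).mean(0)
    return full - skip  # larger => more in-distribution


# Usage (untrained model, random data, shape check only):
model = TwoLevelVAE()
x = torch.bernoulli(torch.rand(8, 784))
print(llr_score(model, x).shape)  # torch.Size([8])
```

The ratio cancels the contribution of low-level features that generic image data tend to share, which is why a raw likelihood can rank OOD inputs above in-distribution ones while a score like this need not; the exact estimator and model family used in the paper may differ from this sketch.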
Author Information
Jakob D. Havtorn (Technical University of Denmark)
Jes Frellsen (Technical University of Denmark)
Søren Hauberg (Technical University of Denmark)
Lars Maaløe (Corti)
Related Events (a corresponding poster, oral, or spotlight)

- 2021 Poster: Hierarchical VAEs Know What They Don't Know
  Tue. Jul 20th, 04:00 -- 06:00 PM, Room: Virtual
More from the Same Authors

- 2023: Variational Point Encoding Deformation for Dental Modeling
  Johan Ye · Thomas Ørkild · Peter Søndergard · Søren Hauberg
- 2021 Poster: Isometric Gaussian Process Latent Variable Model for Dissimilarity Data
  Martin Jørgensen · Søren Hauberg
- 2021 Spotlight: Isometric Gaussian Process Latent Variable Model for Dissimilarity Data
  Martin Jørgensen · Søren Hauberg
- 2020 Workshop: Learning with Missing Values
  Julie Josse · Jes Frellsen · Pierre-Alexandre Mattei · Gael Varoquaux
- 2020: Opening Session
  Julie Josse · Jes Frellsen · Pierre-Alexandre Mattei · Gael Varoquaux
- 2020 Poster: Variational Autoencoders with Riemannian Brownian Motion Priors
  Dimitris Kalatzis · David Eklund · Georgios Arvanitidis · Søren Hauberg