

Poster

A Rate-Distortion View of Uncertainty Quantification

Ifigeneia Apostolopoulou · Benjamin Eysenbach · Frank Nielsen · Artur Dubrawski

Hall C 4-9 #1017
[ Project Page ]
Tue 23 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract:

In supervised learning, understanding an input’s proximity to the training data can help a model decide whether it has sufficient evidence for making a reliable prediction. While powerful probabilistic models such as Gaussian Processes naturally have this property, deep neural networks often lack it. In this paper, we introduce the Distance Aware Bottleneck (DAB), a new method for enriching deep neural networks with this property. Building on prior information bottleneck approaches, our method learns a codebook that stores a compressed representation of all inputs seen during training. The distance of a new example from this codebook can serve as an uncertainty estimate for the example. The resulting model is simple to train and provides deterministic uncertainty estimates in a single forward pass. Finally, our method achieves better out-of-distribution (OOD) detection and misclassification prediction than prior methods, including expensive ensemble methods, deep kernel Gaussian Processes, and approaches based on the standard information bottleneck.
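To make the codebook idea concrete, the sketch below shows one plausible way such a module could look in PyTorch: an encoder feature is softly assigned to a set of learned centroids, the expected distance to those centroids is read off as an uncertainty score, and the soft combination of centroids serves as the compressed representation. This is a minimal illustration assuming squared Euclidean distances, a soft-min assignment with a temperature, and names such as `DistanceAwareBottleneck` and `num_codes`; it is not the authors' implementation or their training objective.

```python
import torch
import torch.nn as nn


class DistanceAwareBottleneck(nn.Module):
    """Illustrative codebook bottleneck: distance to the codebook = uncertainty."""

    def __init__(self, feature_dim: int, num_codes: int = 64, temperature: float = 1.0):
        super().__init__()
        # Codebook of centroids meant to summarize the training data in feature space.
        self.codebook = nn.Parameter(torch.randn(num_codes, feature_dim))
        self.temperature = temperature

    def forward(self, features: torch.Tensor):
        # Squared Euclidean distance from each feature to every codebook entry: (B, K).
        dists = torch.cdist(features, self.codebook) ** 2
        # Soft assignment of each input to codebook entries: (B, K).
        weights = torch.softmax(-dists / self.temperature, dim=-1)
        # Expected distance to the codebook: larger values suggest the input is
        # farther from anything seen during training.
        uncertainty = (weights * dists).sum(dim=-1)
        # Compressed representation: convex combination of codebook entries, (B, D).
        code = weights @ self.codebook
        return code, uncertainty


# Usage: placed after any encoder, a single deterministic forward pass yields
# both a representation for the prediction head and an uncertainty score.
encoder = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 16))
bottleneck = DistanceAwareBottleneck(feature_dim=16, num_codes=64)
x = torch.randn(8, 32)
code, uncertainty = bottleneck(encoder(x))
print(code.shape, uncertainty.shape)  # torch.Size([8, 16]) torch.Size([8])
```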
