Poster
in
Workshop: Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators

Energy-based Hopfield Boosting for Out-of-Distribution Detection

Claus Hofmann · Simon Schmid · Bernhard Lehner · Daniel Klotz · Sepp Hochreiter

Keywords: [ log-sum-exp ] [ OOD ] [ real softmax ] [ soft nearest neighbor ] [ hopfield ]


Abstract:

Out-of-distribution (OOD) detection is critical when deploying machine learning models in the real world. Outlier exposure (OE) methods, which incorporate auxiliary outlier data (AUX) into the training process, can drastically improve OOD detection performance. We introduce Hopfield Boosting, a boosting approach that leverages modern Hopfield energy to sharpen the decision boundary between in-distribution (ID) and OOD data. Hopfield Boosting encourages the model to focus on hard-to-distinguish auxiliary outlier examples that lie close to the decision boundary between ID and AUX data. Our method achieves a new state-of-the-art in OOD detection with OE, improving the FPR95 from 2.28 to 0.92 on CIFAR-10, from 11.24 to 7.94 on CIFAR-100, and from 50.74 to 36.60 on ImageNet-1K.
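The abstract's core idea, scoring a query by a modern Hopfield energy (a log-sum-exp over similarities to stored patterns) relative to ID and AUX memories, can be sketched as follows. This is a minimal illustration assuming normalized patterns and a simple energy-difference score; all function names and the exact score are illustrative, not the authors' implementation.

```python
import numpy as np

def log_sum_exp(beta, scores):
    # Numerically stable lse(beta, s) = (1/beta) * log(sum(exp(beta * s)))
    m = np.max(beta * scores)
    return (m + np.log(np.sum(np.exp(beta * scores - m)))) / beta

def hopfield_ood_score(query, id_patterns, aux_patterns, beta=4.0):
    """Illustrative energy-based OOD score (not the paper's exact method).

    The modern Hopfield energy of a query q w.r.t. stored patterns X is
    -lse(beta, X @ q); a query near the ID memories has low ID energy, so
    the difference of ID and AUX energies separates ID from OOD inputs.
    """
    sim_id = id_patterns @ query    # similarities to in-distribution memories
    sim_aux = aux_patterns @ query  # similarities to auxiliary outlier memories
    e_id = -log_sum_exp(beta, sim_id)
    e_aux = -log_sum_exp(beta, sim_aux)
    return e_id - e_aux  # lower value -> more in-distribution
```

For example, with orthonormal ID and AUX patterns, a query aligned with an ID pattern yields a lower score than one aligned with an AUX pattern.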
