

Poster

Fast Decision Boundary based Out-of-Distribution Detector

Litian Liu · Yao Qin


Abstract:

Efficient and effective Out-of-Distribution (OOD) detection is essential for the safe deployment of AI. Recently, studies have revealed that detecting OOD based on feature space information can be highly effective. Despite their effectiveness, however, existing feature-space OOD methods may incur a non-negligible computational overhead, due to their reliance on auxiliary models built from training features. In this paper, we aim to obviate auxiliary models to optimize computational efficiency while leveraging the rich information embedded in the feature space and utilizing class-specific information. Specifically, we investigate from the novel perspective of decision boundaries and propose to detect OOD using the feature distance to decision boundaries. To minimize the cost of measuring the distance, we introduce an efficient closed-form estimation, analytically proven to tightly lower bound the distance. Using our estimation method, we observe that ID features tend to reside further from the decision boundaries than OOD features. Building on this observation, we propose a hyperparameter-free, auxiliary model-free OOD detector. Our OOD detector matches or surpasses the effectiveness of state-of-the-art methods across extensive experiments, while incurring practically negligible overhead in inference latency. Overall, we significantly enhance the efficiency-effectiveness trade-off in OOD detection.
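For a linear classification head, the distance from a feature to the decision boundary between two classes admits a simple closed form, which is the kind of estimate the abstract refers to. The sketch below is illustrative only, not the authors' exact formulation: the function name fdbd_score, the averaging over non-predicted classes, and the optional normalization by distance to the training feature mean are assumptions.

```python
import numpy as np

def fdbd_score(feature, W, b, train_feature_mean=None):
    """Sketch of a decision-boundary-based OOD score (assumptions noted above).

    For a linear head f(z) = W z + b, the distance from feature z to the
    boundary between the predicted class c and another class k has the
    closed form (f_c(z) - f_k(z)) / ||w_c - w_k||, which serves as an
    efficient estimate of the feature's distance to that boundary.
    """
    logits = W @ feature + b
    c = int(np.argmax(logits))                 # predicted class
    dists = []
    for k in range(len(logits)):
        if k == c:
            continue
        margin = logits[c] - logits[k]         # non-negative for the predicted class
        w_gap = np.linalg.norm(W[c] - W[k])    # ||w_c - w_k||
        dists.append(margin / (w_gap + 1e-12))
    score = float(np.mean(dists))              # average distance to boundaries
    # Optional normalization by distance to the training feature mean
    # (an assumed regularization choice, not confirmed by the abstract).
    if train_feature_mean is not None:
        score /= np.linalg.norm(feature - train_feature_mean) + 1e-12
    return score  # larger score -> feature lies further from boundaries -> more ID-like


# Usage sketch: threshold the score; features far from all decision
# boundaries are treated as in-distribution, near ones as OOD.
rng = np.random.default_rng(0)
W, b = rng.normal(size=(10, 512)), rng.normal(size=10)
z = rng.normal(size=512)
print(fdbd_score(z, W, b))
```

Because the score only needs the penultimate-layer feature and the classifier weights already computed at inference, it avoids any auxiliary model and adds essentially no latency.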
