Bayes-inspired Integration of Pretrained Priors and Few-Shot Evidence for Few-Shot Classification
Abstract
Few-shot classification aims to adapt a pretrained model to novel classes with only a handful of labeled examples. While current methods often combine pretrained knowledge and few-shot evidence heuristically, we seek a more principled understanding of their relationship. In this paper, we propose a Bayesian-inspired optimal integration framework (BOIF) that interprets pretrained models as priors and few-shot evidence as likelihoods. Under a conditional independence approximation, we show that the optimal log-posterior decomposes into the sum of prior logits and likelihood logits. This yields a simple yet effective design principle: decouple the prior and likelihood pathways and combine their logits additively. Guided by this principle, we implement BOIF on top of CLIP with two novel enhancements: (1) a multi-level feature adapter that enriches visual representations, and (2) a simplified cache module for likelihood estimation. Extensive experiments on 11 benchmarks show that BOIF achieves state-of-the-art performance (e.g., 80.61\% average accuracy in the 16-shot setting) and strong out-of-distribution robustness. Our work thus provides both a principled perspective on few-shot adaptation and an effective instantiation of it.
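As a minimal sketch of the stated decomposition (with assumed notation, not necessarily the paper's: $x$ is a query image, $y$ a class label, and $\mathcal{D}$ the few-shot support set), Bayes' rule gives $p(y \mid x, \mathcal{D}) \propto p(\mathcal{D} \mid y, x)\, p(y \mid x)$, and approximating $\mathcal{D}$ as conditionally independent of $x$ given $y$ yields
\[
\log p(y \mid x, \mathcal{D}) \;=\; \underbrace{\log p(y \mid x)}_{\text{prior logits (pretrained model)}} \;+\; \underbrace{\log p(\mathcal{D} \mid y)}_{\text{likelihood logits (few-shot evidence)}} \;+\; \mathrm{const},
\]
where the constant absorbs $-\log p(\mathcal{D} \mid x)$, which does not depend on $y$; this is the additive combination of the two logit pathways referred to above.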