Poster

Ensemble Pruning under Distribution Shifts

Fengchun Qiao · Xi Peng


Abstract:

Ensembles of deep neural networks have achieved great success in hedging against single-model failure under distribution shift. However, existing techniques often produce redundant models, limiting predictive diversity and compromising generalization performance. Moreover, existing ensemble pruning methods can only guarantee predictive diversity for in-distribution data, which may not transfer well to out-of-distribution (OoD) data. To address this gap, we propose a principled optimization framework for ensemble pruning under distribution shifts. Since annotations of test data are unavailable, we explore relationships between the models' prediction distributions, encapsulated in a model topology graph. By incorporating this topology into a combinatorial optimization framework, complementary models with high predictive diversity are selected with theoretical guarantees. Our approach is model-agnostic and can be applied on top of a broad spectrum of off-the-shelf ensembling methods for improved generalization performance. Extensive experiments on common benchmarks demonstrate the superiority of our approach in both multi- and single-source OoD generalization.
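To make the pruning idea concrete, the sketch below is a minimal, hypothetical illustration rather than the paper's method: it assumes mean Jensen-Shannon distance between models' predicted class distributions on unlabeled data as the diversity measure, and a greedy heuristic in place of the paper's combinatorial optimization over the model topology graph. The function names (pairwise_divergence, greedy_prune) and all parameters are invented for illustration.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def pairwise_divergence(probs):
    """probs: (n_models, n_samples, n_classes) softmax outputs on unlabeled data.
    Returns an (n_models, n_models) matrix of mean Jensen-Shannon distances,
    a stand-in for edge weights in a model topology graph."""
    n_models = probs.shape[0]
    div = np.zeros((n_models, n_models))
    for i in range(n_models):
        for j in range(i + 1, n_models):
            # Average JS distance across samples between models i and j.
            d = np.mean([jensenshannon(p, q) for p, q in zip(probs[i], probs[j])])
            div[i, j] = div[j, i] = d
    return div

def greedy_prune(div, k):
    """Greedily select k models that maximize total pairwise divergence,
    a simple surrogate objective for predictive diversity (illustrative only;
    the paper's optimization carries theoretical guarantees this heuristic lacks)."""
    n = div.shape[0]
    # Seed with the most mutually dissimilar pair of models.
    selected = list(np.unravel_index(np.argmax(div), div.shape))
    while len(selected) < k:
        remaining = [m for m in range(n) if m not in selected]
        # Add the model farthest on average from those already chosen.
        best = max(remaining, key=lambda m: div[m, selected].sum())
        selected.append(best)
    return sorted(selected)

# Toy usage: 8 ensemble members, 100 unlabeled samples, 10 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 100, 10))
probs = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)
print(greedy_prune(pairwise_divergence(probs), k=3))  # indices of kept models
```

Because the selection relies only on prediction distributions, no test-set labels are needed, mirroring the label-free setting the abstract describes.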
