

Poster in Workshop: Dynamic Neural Networks

A Product of Experts Approach to Early-Exit Ensembles

James Allingham · Eric Nalisnick


Abstract:

Ensembles are often expensive to evaluate since they require running multiple models, each of which is costly in the case of neural networks. Using ensembles in compute-constrained applications would be much more practical if only a subset of the models needed to be evaluated. We address this issue with a novel product-of-experts-based method for early-exit ensembling. We rely on the fact that the product of finite-support probability distributions (e.g., the continuous uniform) has support no larger than that of any of its factors. Thus, by setting a confidence threshold, we can stop evaluating ensemble members once the size of the support has been sufficiently reduced. We demonstrate our methodology for both real-valued regression and multi-class classification.
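
The following is a minimal, hypothetical sketch of the early-exit idea for the regression case, assuming each expert returns a uniform predictive distribution represented as an interval; the names (early_exit_product_of_uniforms, width_threshold, and an expert(x) call returning an interval) are illustrative and not taken from the paper.

import numpy as np

def early_exit_product_of_uniforms(experts, x, width_threshold):
    # Running support of the product of the uniform experts evaluated so far.
    low, high = -np.inf, np.inf
    n_used = 0
    for expert in experts:
        e_low, e_high = expert(x)                       # expert's predicted interval
        low, high = max(low, e_low), min(high, e_high)  # product support = intersection
        n_used += 1
        if high - low <= width_threshold:
            break  # early exit: remaining experts are never evaluated
    # Return the product's support and the number of experts actually used.
    return (low, high), n_used

# Example usage with three toy experts predicting intervals around x:
experts = [lambda x, s=s: (x - s, x + s) for s in (3.0, 2.0, 0.5)]
support, n_used = early_exit_product_of_uniforms(experts, x=1.0, width_threshold=2.0)
print(support, n_used)

Because a product of uniform densities is supported on the intersection of the individual intervals, the running support can only shrink as more experts are evaluated, which is what justifies stopping as soon as its width falls below the chosen threshold.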
