Poster
High-Quality Prediction Intervals for Deep Learning: A Distribution-Free, Ensembled Approach
Tim Pearce · Alexandra Brintrup · Mohamed Zaki · Andy Neely

Fri Jul 13 09:15 AM -- 12:00 PM (PDT) @ Hall B #8

This paper considers the generation of prediction intervals (PIs) by neural networks for quantifying uncertainty in regression tasks. It is axiomatic that high-quality PIs should be as narrow as possible, whilst capturing a specified portion of data. We derive a loss function directly from this axiom that requires no distributional assumption. We show how its form derives from a likelihood principle, that it can be used with gradient descent, and that model uncertainty can be accounted for through ensembling. Benchmark experiments show the method outperforms current state-of-the-art uncertainty quantification methods, reducing average PI width by over 10%.
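To illustrate the idea in the abstract, the sketch below shows one way a coverage-versus-width interval loss could be written in PyTorch. It is not the paper's exact formulation: the function name interval_loss, the hyperparameters alpha, lam, and softness, and the particular weighting of the penalty term are assumptions made here for illustration only.

    import torch

    def interval_loss(y_lower, y_upper, y_true, alpha=0.05, lam=15.0, softness=160.0):
        # Illustrative sketch (not the paper's exact loss): reward narrow
        # intervals while penalising coverage below the target 1 - alpha.
        # Soft membership indicator keeps coverage differentiable for
        # gradient-based training.
        k_soft = (torch.sigmoid(softness * (y_true - y_lower)) *
                  torch.sigmoid(softness * (y_upper - y_true)))
        # Hard membership indicator, used only to measure width over
        # the points that are actually captured.
        k_hard = ((y_true > y_lower) & (y_true < y_upper)).float()

        # Mean width of the intervals that capture their target (narrowness term).
        captured_width = (torch.sum((y_upper - y_lower) * k_hard) /
                          (torch.sum(k_hard) + 1e-6))
        # Empirical coverage estimated with the soft indicator.
        coverage = torch.mean(k_soft)
        # Penalise coverage falling below the desired 1 - alpha level.
        coverage_penalty = torch.clamp((1.0 - alpha) - coverage, min=0.0) ** 2

        return captured_width + lam * coverage_penalty

In such a setup, y_lower and y_upper would be two outputs of the network for each input, and an ensemble of independently trained networks could be used to account for model uncertainty, as the abstract describes.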

Author Information

Tim Pearce (University of Cambridge / The Alan Turing Institute)
Alexandra Brintrup
Mohamed Zaki (University of Cambridge)
Andy Neely
