We are concerned with obtaining well-calibrated output distributions from regression models. Such distributions let us quantify the model's uncertainty about the predicted target value. We introduce the novel concept of distribution calibration, and demonstrate its advantages over the existing definition of quantile calibration. We further propose a post-hoc approach to improving the predictions of previously trained regression models, using multi-output Gaussian Processes with a novel Beta link function. The proposed method is experimentally verified on a set of common regression models and shows improvements in both distribution-level and quantile-level calibration.
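As a rough sketch of the idea (not the paper's implementation): the post-hoc step passes each CDF value produced by the base regression model through the CDF of a Beta distribution, so a miscalibrated predictive distribution can be reshaped without retraining the model. In the full method the Beta parameters are output by a multi-output Gaussian Process conditioned on the model's prediction; in the minimal sketch below they are fixed constants, the base predictor is assumed to be a standard normal, and all helper names are illustrative.

```python
import math

def beta_cdf(x, a, b, n=2000):
    # Regularized incomplete beta function: P(T <= x) for T ~ Beta(a, b),
    # computed here by midpoint-rule integration of the density
    # (purely illustrative; a numerical library routine would normally be used).
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    step = x / n
    area = 0.0
    for i in range(n):
        t = (i + 0.5) * step
        area += t ** (a - 1.0) * (1.0 - t) ** (b - 1.0) * step
    return area / norm

def base_model_cdf(y):
    # Stand-in for the trained regression model's predictive CDF;
    # assumed here to be a standard normal N(0, 1).
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))

def calibrated_cdf(y, a, b):
    # Composed predictive CDF: push the base model's CDF value through
    # the Beta(a, b) link. In the paper a, b come from a multi-output GP;
    # here they are fixed parameters chosen for illustration.
    return beta_cdf(base_model_cdf(y), a, b)
```

With a = b = 1 the Beta link is the identity and the base prediction is returned unchanged; a = b > 1 concentrates probability mass toward the median (a sharper predictive distribution), while a = b < 1 spreads it out.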
Hao Song (University of Bristol)
Tom Diethe (Amazon)
Meelis Kull (University of Tartu)
Peter Flach (University of Bristol)
Related Events:
2019 Oral: Distribution calibration for regression
Wed Jun 12th, 11:00--11:20 AM, Room 101