SpeedCP: Fast Kernel-based Conditional Conformal Prediction
Abstract
Conformal prediction provides distribution-free prediction sets, and recent RKHS-based frameworks extend its finite-sample guarantees to conditional coverage under complex covariate shifts; however, they suffer from prohibitive computational costs. To guarantee conditional validity under such shifts while remaining computationally feasible, we build on the framework of Gibbs et al. (2023), introducing a stable and efficient algorithm that computes the full solution path of the regularized RKHS conformal optimization problem at essentially the cost of a single kernel quantile fit. Our approach performs simultaneous hyperparameter tuning, yielding smoothness control and data-adaptive calibration. To extend the method to high-dimensional settings, we further integrate it with low-rank latent embeddings that capture conditional validity in a data-driven latent space. Empirically, our method provides reliable conditional coverage across a variety of modern black-box predictors, shortening the intervals of Gibbs et al. (2023) by 30% while achieving a 40-fold speedup.