Conformal Risk-Averse Decision Making with Action-Conditional Guarantees
Abstract
Reliable decision-making pipelines powered by machine learning models require uncertainty quantification (UQ) methods that come with explicit safety guarantees. Conformal prediction provides such UQ by wrapping ML predictions into prediction sets, and recent work by \cite{kiyani2025decision} established that these sets can be translated into optimal risk-averse decision policies; however, those policies inherit only marginal safety guarantees. We generalize and strengthen their results by (i) introducing action-conditional conformal prediction, which yields safety guarantees conditioned explicitly on each action taken by the decision maker, (ii) showing that action-conditional prediction sets serve as a proxy for the feasible decision space of risk-averse decision makers who optimize action-conditional value-at-risk, and (iii) proposing a principled finite-sample algorithm based on pinball-loss minimization, connecting the framework of \cite{gibbs2025conformal} to action-conditional guarantees. Experiments on two real-world datasets confirm that our approach significantly improves action-conditional performance over several conformal baselines.