Continuous Vector Quantile Regression
Sanketh Vedula · Irene Tallini · Aviv A. Rosenberg · Marco Pegoraro · Emanuele Rodola · Yaniv Romano · Alexander Bronstein
Event URL: https://openreview.net/forum?id=DUZbGAXcyL
Vector quantile regression (VQR) estimates the conditional vector quantile function (CVQF), a fundamental quantity which fully represents the conditional distribution of $\mathbf{Y}|\mathbf{X}$. VQR is formulated as an optimal transport (OT) problem between a uniform $\mathbf{U}\sim\mu$ and the target $(\mathbf{X},\mathbf{Y})\sim\nu$, the solution of which is a unique transport map, co-monotonic with $\mathbf{U}$. Recently, nonlinear VQR (NL-VQR) was proposed to support the estimation of non-linear CVQFs, together with fast solvers that enabled the use of this tool in practical applications. Despite its utility, the scalability and estimation quality of NL-VQR are limited by the discretization of the OT problem onto a grid of quantile levels. We propose a novel _continuous_ formulation and parametrization of VQR using partial input-convex neural networks (PICNNs). Our approach allows for accurate, scalable, differentiable and invertible estimation of non-linear CVQFs. We further demonstrate, theoretically and experimentally, how continuous CVQFs can be used for general statistical inference tasks: estimation of likelihoods, CDFs, confidence sets, coverage, sampling, and more. This work is an important step towards unlocking the full potential of VQR.
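As an illustrative sketch (not the authors' released implementation), a continuous CVQF of this kind can be parametrized as the gradient, with respect to the quantile level $\mathbf{u}$, of a potential $\varphi(\mathbf{u};\mathbf{x})$ that is convex in $\mathbf{u}$, e.g. a PICNN. The layer sizes, activations, and the specific two-layer construction below are assumptions made for brevity:

```python
# A minimal sketch of a PICNN potential phi(u; x), convex in u, whose gradient
# in u plays the role of the conditional vector quantile function Q_{Y|X}(u|x).
# Architectural details are illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PICNN(nn.Module):
    def __init__(self, dim_u, dim_x, hidden=64):
        super().__init__()
        # Unconstrained "context" path in x.
        self.ctx = nn.Sequential(nn.Linear(dim_x, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        # First layer is affine in u, so its weights may take any sign.
        self.Wu1 = nn.Linear(dim_u, hidden)
        # Weights acting on the convex path must be non-negative to keep
        # phi convex in u; ReLU enforces this at forward time.
        self.Wz2 = nn.Parameter(0.1 * torch.rand(hidden, hidden))
        self.Wu2 = nn.Linear(dim_u, hidden, bias=False)  # affine skip path in u
        self.wz3 = nn.Parameter(0.1 * torch.rand(1, hidden))

    def forward(self, u, x):
        c = self.ctx(x)                                   # features of x
        z1 = F.softplus(self.Wu1(u) + c)                  # convex in u
        z2 = F.softplus(z1 @ F.relu(self.Wz2).T + self.Wu2(u) + c)
        return z2 @ F.relu(self.wz3).T                    # scalar potential, convex in u


def cvqf(picnn, u, x):
    """Q_{Y|X}(u | x) = grad_u phi(u; x): a co-monotone map from quantile
    levels u to responses y."""
    u = u.clone().requires_grad_(True)
    phi = picnn(u, x).sum()
    return torch.autograd.grad(phi, u, create_graph=True)[0]
```

Under these assumptions, sampling from $\mathbf{Y}|\mathbf{X}=\mathbf{x}$ amounts to drawing $\mathbf{u}\sim\mu$ and evaluating `cvqf`, and the convexity of the potential is what makes the resulting map co-monotone and invertible.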
Author Information
Sanketh Vedula (Computer Science Department, Technion-Israel Institute of Technology)
Irene Tallini (University of Roma "La Sapienza")
Aviv A. Rosenberg (Faculty of Computer Science, Technion-Israel Institute of Technology)
Marco Pegoraro (Sapienza University of Rome)
Emanuele Rodola (Sapienza University of Rome)
Yaniv Romano (Technion---Israel Institute of Technology)
Alexander Bronstein (Technion)
More from the Same Authors
- 2023: Leveraging sparse and shared feature activations for disentangled representation learning
  Marco Fumero · Florian Wenzel · Luca Zancato · Alessandro Achille · Emanuele Rodola · Stefano Soatto · Bernhard Schölkopf · Francesco Locatello
- 2023: Vector Quantile Regression on Manifolds
  Marco Pegoraro · Sanketh Vedula · Aviv A. Rosenberg · Irene Tallini · Emanuele Rodola · Alexander Bronstein
- 2023: Explanatory Learning: Towards Artificial Scientific Discovery
  Antonio Norelli · Giorgio Mariani · Luca Moschella · Andrea Santilli · Giambattista Parascandolo · Simone Melzi · Emanuele Rodola
- 2023: Infusing invariances in neural representations
  Irene Cannistraci · Marco Fumero · Luca Moschella · Valentino Maiorca · Emanuele Rodola
- 2023 Poster: Conformal Prediction with Missing Values
  Margaux Zaffran · Aymeric Dieuleveut · Julie Josse · Yaniv Romano
- 2022 Poster: An Asymptotic Test for Conditional Independence using Analytic Kernel Embeddings
  Meyer Scetbon · Laurent Meunier · Yaniv Romano
- 2022 Spotlight: An Asymptotic Test for Conditional Independence using Analytic Kernel Embeddings
  Meyer Scetbon · Laurent Meunier · Yaniv Romano
- 2022 Poster: Image-to-Image Regression with Distribution-Free Uncertainty Quantification and Applications in Imaging
  Anastasios Angelopoulos · Amit Pal Kohli · Stephen Bates · Michael Jordan · Jitendra Malik · Thayer Alshaabi · Srigokul Upadhyayula · Yaniv Romano
- 2022 Poster: Coordinated Double Machine Learning
  Nitai Fingerhut · Matteo Sesia · Yaniv Romano
- 2022 Spotlight: Coordinated Double Machine Learning
  Nitai Fingerhut · Matteo Sesia · Yaniv Romano
- 2022 Spotlight: Image-to-Image Regression with Distribution-Free Uncertainty Quantification and Applications in Imaging
  Anastasios Angelopoulos · Amit Pal Kohli · Stephen Bates · Michael Jordan · Jitendra Malik · Thayer Alshaabi · Srigokul Upadhyayula · Yaniv Romano
- 2021 Poster: Learning disentangled representations via product manifold projection
  Marco Fumero · Luca Cosmo · Simone Melzi · Emanuele Rodola
- 2021 Spotlight: Learning disentangled representations via product manifold projection
  Marco Fumero · Luca Cosmo · Simone Melzi · Emanuele Rodola