

Poster

Mediated Uncoupled Learning: Learning Functions without Direct Input-output Correspondences

Ikko Yamane · Junya Honda · Florian YGER · Masashi Sugiyama

Virtual

Keywords: [ Semi-supervised learning ] [ Algorithms ] [ Classification ]


Abstract: Ordinary supervised learning is useful when we have paired training data of input $X$ and output $Y$. However, such paired data can be difficult to collect in practice. In this paper, we consider the task of predicting $Y$ from $X$ when we have no paired data of them, but we have two separate, independent datasets of $X$ and $Y$, each observed with some mediating variable $U$; that is, we have two datasets $S_X = \{(X_i, U_i)\}$ and $S_Y = \{(U_j, Y_j)\}$. A naive approach is to predict $U$ from $X$ using $S_X$ and then $Y$ from $U$ using $S_Y$, but we show that this is not statistically consistent. Moreover, predicting $U$ can be more difficult than predicting $Y$ in practice, e.g., when $U$ has higher dimensionality. To circumvent the difficulty, we propose a new method that avoids predicting $U$ but directly learns $Y = f(X)$ by training $f(X)$ with $S_X$ to predict $h(U)$, which is trained with $S_Y$ to approximate $Y$. We prove statistical consistency and error bounds of our method and experimentally confirm its practical usefulness.
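The two-stage construction in the abstract can be illustrated with a short sketch. The code below is a minimal illustration under assumed choices, not the authors' implementation: it uses squared-loss ridge regression for both models, synthetic data, and hypothetical names for the datasets and models. Step 1 fits $h(U) \approx Y$ on $S_Y$; step 2 fits $f(X) \approx h(U)$ on $S_X$, so $f$ never has to reproduce the mediating variable $U$ itself.

```python
# Sketch of the two-stage idea from the abstract (illustrative assumptions:
# ridge regression as the hypothesis class, synthetic mediated data).
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic data generated along X -> U -> Y (for illustration only).
n_x, n_y, d_x, d_u = 500, 500, 5, 20
A = rng.normal(size=(d_x, d_u))
w = rng.normal(size=d_u)

# S_X = {(X_i, U_i)}: inputs paired with the mediating variable.
X = rng.normal(size=(n_x, d_x))
U_x = X @ A + 0.1 * rng.normal(size=(n_x, d_u))

# S_Y = {(U_j, Y_j)}: mediating variable paired with the output.
U_y = rng.normal(size=(n_y, d_u))
Y_y = U_y @ w + 0.1 * rng.normal(size=n_y)

# Step 1: train h(U) with S_Y to approximate Y.
h = Ridge(alpha=1.0).fit(U_y, Y_y)

# Step 2: train f(X) with S_X to predict h(U), not U itself.
f = Ridge(alpha=1.0).fit(X, h.predict(U_x))

# f(X) now serves as the predictor of Y from X.
X_test = rng.normal(size=(10, d_x))
print(f.predict(X_test))
```

Note the contrast with the naive pipeline: there, $f$ would have to regress the 20-dimensional $U$ from $X$ before applying $h$, whereas here it only has to fit the scalar target $h(U)$, which is the point the abstract makes about $U$ being harder to predict than $Y$.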
