

Poster

Large-Scale Sparse Kernel Canonical Correlation Analysis

Viivi Uurtio · Sahely Bhadra · Juho Rousu

Pacific Ballroom #226

Keywords: [ Kernel Methods ] [ Unsupervised Learning ]


Abstract: This paper presents gradKCCA, a large-scale sparse non-linear canonical correlation method. Like Kernel Canonical Correlation Analysis (KCCA), our method finds non-linear relations through kernel functions, but it does not rely on a kernel matrix, a known bottleneck for scaling up kernel methods. gradKCCA corresponds to solving KCCA with the additional constraint that the canonical projection directions in the kernel-induced feature space have preimages in the original data space. Firstly, this modification allows us to maximize kernel canonical correlation very efficiently through an alternating projected gradient algorithm that works in the original data space. Secondly, we can control the sparsity of the projection directions by constraining the $\ell_1$ norm of their preimages, facilitating interpretation of the discovered patterns, a capability not available in standard KCCA. Our empirical experiments demonstrate that gradKCCA outperforms state-of-the-art CCA methods in terms of speed and robustness to noise on both simulated and real-world datasets.
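To make the idea concrete, below is a minimal, illustrative sketch of an alternating projected gradient ascent on a preimage-parameterized kernel correlation objective, in the spirit of the abstract. It is not the authors' reference implementation: the polynomial kernel, the numerical gradients, the $\ell_1$-ball projection routine, and all function names (poly_kernel_vec, project_l1_ball, grad_kcca_sketch) are assumptions chosen for brevity.

```python
import numpy as np

def poly_kernel_vec(X, w, degree=2, c=1.0):
    """Score each row of X against the preimage direction w via a polynomial kernel."""
    return (X @ w + c) ** degree

def corr(a, b, eps=1e-12):
    """Correlation between two score vectors after centering."""
    a = a - a.mean()
    b = b - b.mean()
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)

def project_l1_ball(w, radius):
    """Euclidean projection onto the l1 ball of the given radius."""
    if np.abs(w).sum() <= radius:
        return w
    u = np.sort(np.abs(w))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(w) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(w) * np.maximum(np.abs(w) - theta, 0.0)

def grad_kcca_sketch(X, Y, l1_radius=1.0, step=1e-2, iters=500, seed=0):
    """Alternating projected gradient ascent on a preimage-constrained
    kernel correlation objective (illustrative sketch, not gradKCCA itself)."""
    rng = np.random.default_rng(seed)
    u = project_l1_ball(rng.standard_normal(X.shape[1]), l1_radius)
    v = project_l1_ball(rng.standard_normal(Y.shape[1]), l1_radius)

    def objective(u_, v_):
        return corr(poly_kernel_vec(X, u_), poly_kernel_vec(Y, v_))

    def num_grad(f, w, h=1e-5):
        # Numerical gradient keeps the sketch short; an analytic gradient
        # of the kernelized scores would be used in practice.
        g = np.zeros_like(w)
        for i in range(len(w)):
            e = np.zeros_like(w)
            e[i] = h
            g[i] = (f(w + e) - f(w - e)) / (2 * h)
        return g

    for _ in range(iters):
        # Ascend in u with v fixed, then in v with u fixed; project each
        # update back onto the l1 ball to keep the preimages sparse.
        u = project_l1_ball(u + step * num_grad(lambda w: objective(w, v), u), l1_radius)
        v = project_l1_ball(v + step * num_grad(lambda w: objective(u, w), v), l1_radius)
    return u, v, objective(u, v)
```

Because the optimization variables u and v live in the original data spaces, no n-by-n kernel matrix is ever formed, which is the scalability point the abstract emphasizes; the $\ell_1$ radius plays the role of the sparsity control.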
