

Poster

Kernel Normalized Cut: a Theoretical Revisit

Yoshikazu Terada · Michio Yamamoto

Pacific Ballroom #194

Keywords: [ Unsupervised Learning ] [ Unsupervised and Semi-supervised Learning ] [ Kernel Methods ] [ Clustering ]


Abstract:

In this paper, we study the theoretical properties of clustering based on the kernel normalized cut. Our first contribution is to derive a non-asymptotic upper bound on the expected distortion rate of the kernel normalized cut. From this result, we show that the solution of the kernel normalized cut converges to that of the population-level weighted k-means clustering on a certain reproducing kernel Hilbert space (RKHS). Our second contribution is the discovery of the interesting fact that the population-level weighted k-means clustering in the RKHS is equivalent to the population-level normalized cut. Combining these results, we can see that the kernel normalized cut converges to the population-level normalized cut. The criterion of the population-level normalized cut can be considered as an indivisibility of the population distribution, and this criterion plays an important role in the theoretical analysis of spectral clustering in Schiebinger et al. (2015). We believe that our results will provide deep insights into the behavior of both the normalized cut and spectral clustering.
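As a rough illustration of the criterion the paper analyzes, the empirical normalized cut of a partition can be computed from a kernel (similarity) matrix: for each cluster, sum the similarities crossing the cluster boundary and divide by the cluster's volume (the sum of degrees of its points). The sketch below is a minimal toy example with a Gaussian kernel and hypothetical data; it is not the paper's estimator, only the standard multiway normalized-cut objective that the paper's population-level results concern.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical toy data: two well-separated Gaussian blobs in the plane.
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
               rng.normal(3.0, 0.3, (20, 2))])

# Gaussian (RBF) kernel matrix with an assumed bandwidth sigma = 0.5.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists / (2 * 0.5 ** 2))

# Degree (volume weight) of each point: row sums of the kernel matrix.
d = K.sum(axis=1)

def normalized_cut(labels, K, d):
    """Multiway normalized cut: sum over clusters of cut(A, A^c) / vol(A)."""
    total = 0.0
    for c in np.unique(labels):
        in_c = labels == c
        cut = K[np.ix_(in_c, ~in_c)].sum()   # similarity leaving cluster c
        vol = d[in_c].sum()                  # total degree inside cluster c
        total += cut / vol
    return total

good = np.array([0] * 20 + [1] * 20)   # matches the two blobs
bad = np.array([0, 1] * 20)            # alternating, ignores the blobs
print(normalized_cut(good, K, d), normalized_cut(bad, K, d))
```

The partition aligned with the two blobs attains a much smaller normalized cut than the alternating one, which is the behavior the convergence results above pertain to. The equivalence discussed in the abstract says that minimizing this objective corresponds to a weighted kernel k-means problem in the RKHS, with the degrees d serving as the weights.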
