Spotlight

Near-Optimal Algorithms for Explainable k-Medians and k-Means

Konstantin Makarychev · Liren Shan

Abstract: We consider the problem of explainable $k$-medians and $k$-means introduced by Dasgupta, Frost, Moshkovitz, and Rashtchian (ICML 2020). In this problem, our goal is to find a \emph{threshold decision tree} that partitions data into $k$ clusters and minimizes the $k$-medians or $k$-means objective. The obtained clustering is easy to interpret because every decision node of a threshold tree splits data based on a single feature into two groups. We propose a new algorithm for this problem which is $\tilde{O}(\log k)$ competitive with $k$-medians with $\ell_1$ norm and $\tilde{O}(k)$ competitive with $k$-means. This is an improvement over the previous guarantees of $O(k)$ and $O(k^2)$ by Dasgupta et al. (2020). We also provide a new algorithm which is $O(\log^{3/2} k)$ competitive for $k$-medians with $\ell_2$ norm. Our first algorithm is near-optimal: Dasgupta et al. (2020) showed a lower bound of $\Omega(\log k)$ for $k$-medians; in this work, we prove a lower bound of $\tilde{\Omega}(k)$ for $k$-means. We also provide a lower bound of $\Omega(\log k)$ for $k$-medians with $\ell_2$ norm.
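To illustrate the object being optimized, the sketch below shows a toy threshold decision tree: each internal node tests a single feature against a threshold, each leaf is one of the $k$ clusters, and the quality of the induced partition is measured by the $k$-medians ($\ell_1$) cost. This is only an illustrative sketch of the problem setup, not the authors' algorithm; the tree, thresholds, and centers are hypothetical placeholders.

```python
# Illustrative sketch only: a toy threshold tree that assigns points to clusters
# and evaluates the k-medians (ell_1) cost. NOT the authors' algorithm; the
# thresholds and centers below are hypothetical placeholders.
import numpy as np


class Node:
    def __init__(self, feature=None, threshold=None, left=None, right=None, cluster=None):
        self.feature = feature      # index of the single feature used for the split
        self.threshold = threshold  # split value: go left if x[feature] <= threshold
        self.left, self.right = left, right
        self.cluster = cluster      # leaf label (one of the k clusters), None for internal nodes


def assign(tree, x):
    """Follow single-feature threshold splits until reaching a leaf; return its cluster."""
    node = tree
    while node.cluster is None:
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.cluster


def k_medians_l1_cost(X, labels, centers):
    """Sum of ell_1 distances from each point to the center of its assigned cluster."""
    return sum(np.abs(x - centers[c]).sum() for x, c in zip(X, labels))


# Toy example: k = 3 clusters in 2 dimensions.
tree = Node(feature=0, threshold=0.5,
            left=Node(cluster=0),
            right=Node(feature=1, threshold=0.3,
                       left=Node(cluster=1),
                       right=Node(cluster=2)))
X = np.array([[0.2, 0.9], [0.8, 0.1], [0.7, 0.8]])
centers = np.array([[0.1, 0.8], [0.9, 0.2], [0.6, 0.9]])
labels = [assign(tree, x) for x in X]
print(labels, k_medians_l1_cost(X, labels, centers))
```

The interpretability claim in the abstract corresponds to this structure: a point's cluster is explained by at most depth-of-tree single-feature comparisons, and the competitive ratios above compare the cost of the best such tree-induced clustering against the unconstrained $k$-medians or $k$-means optimum.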