

Poster

Dimensionality Reduction for General KDE Mode Finding

Xinyu Luo · Christopher Musco · Cas Widdershoven

Exhibit Hall 1 #429

Abstract: Finding the mode of a high dimensional probability distribution D is a fundamental algorithmic problem in statistics and data analysis. There has been particular interest in efficient methods for solving the problem when D is represented as a mixture model or kernel density estimate, although few algorithmic results with worst-case approximation and runtime guarantees are known. In this work, we significantly generalize a result of Lee, Li, and Musco (2021) on mode approximation for Gaussian mixture models. We develop randomized dimensionality reduction methods for mixtures involving a broader class of kernels, including the popular logistic, sigmoid, and generalized Gaussian kernels. As in Lee et al.'s work, our dimensionality reduction results yield quasi-polynomial algorithms for mode finding with multiplicative accuracy (1 − ε) for any ε > 0. Moreover, when combined with gradient descent, they yield efficient practical heuristics for the problem. In addition to our positive results, we prove a hardness result for box kernels, showing that there is no polynomial time algorithm for finding the mode of a kernel density estimate, unless P = NP. Obtaining similar hardness results for kernels used in practice (like Gaussian or logistic kernels) is an interesting future direction.
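To make the "dimensionality reduction plus gradient ascent" heuristic described above concrete, here is a minimal sketch for a Gaussian KDE. It is not the authors' algorithm: the projection matrix, bandwidth handling, starting point, and step size are all illustrative assumptions, and the sketch only searches for an approximate mode of the projected (low-dimensional) density.

```python
import numpy as np


def gaussian_kde(points, bandwidth):
    """Unnormalized Gaussian KDE: f(x) = (1/n) * sum_i exp(-||x - p_i||^2 / (2 h^2))."""
    def f(x):
        d2 = np.sum((points - x) ** 2, axis=1)
        return np.mean(np.exp(-d2 / (2 * bandwidth ** 2)))
    return f


def kde_gradient(points, bandwidth, x):
    """Gradient of the unnormalized Gaussian KDE at x."""
    diff = points - x                                        # shape (n, d)
    w = np.exp(-np.sum(diff ** 2, axis=1) / (2 * bandwidth ** 2))
    return (w[:, None] * diff).sum(axis=0) / (len(points) * bandwidth ** 2)


def approx_mode(points, bandwidth, target_dim=5, steps=500, lr=0.1, seed=0):
    """Heuristic mode finding: randomly project the data, then run gradient
    ascent on the low-dimensional KDE starting from its densest data point."""
    rng = np.random.default_rng(seed)
    n, d = points.shape
    # Johnson-Lindenstrauss-style random projection (scaled Gaussian matrix),
    # which approximately preserves pairwise distances between the points.
    G = rng.normal(size=(d, target_dim)) / np.sqrt(target_dim)
    low = points @ G
    f_low = gaussian_kde(low, bandwidth)
    # Initialize at the projected data point with the highest KDE value.
    x = low[int(np.argmax([f_low(p) for p in low]))].copy()
    for _ in range(steps):
        x += lr * kde_gradient(low, bandwidth, x)
    return x, f_low(x)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(size=(200, 50))   # 200 points in 50 dimensions
    mode_low, density_low = approx_mode(data, bandwidth=1.0)
    print("approximate low-dimensional mode density:", density_low)
```

The point of the projection step is that the KDE's maximum value is approximately preserved in the lower-dimensional space, so the (cheaper) low-dimensional search still certifies a near-optimal density value; recovering a corresponding high-dimensional point requires additional machinery from the paper and is omitted here.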
