

Poster

Efficient Algorithms for Sum-Of-Minimum Optimization

Lisang Ding · Ziang Chen · Xinshang Wang · Wotao Yin


Abstract: In this work, we propose a novel optimization model termed "sum-of-minimum" optimization. This model minimizes the sum or average of $N$ objective functions over $k$ parameters, where each objective takes the minimum value of a predefined sub-function with respect to the $k$ parameters. This is a universal framework covering numerous clustering applications in machine learning and related fields. We develop efficient algorithms for solving sum-of-minimum optimization problems, motivated by a randomized initialization algorithm for classic $k$-means and by Lloyd's algorithm. We establish a new tight bound for the generalized initialization algorithm and prove a gradient-descent-like convergence rate for the generalized Lloyd's algorithm. The efficiency of our algorithms is numerically examined on multiple tasks, including generalized principal component analysis, mixed linear regression, and small-scale neural network training. Our approach compares favorably to previous ones based on simpler but less precise optimization reformulations.
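The abstract describes minimizing $\frac{1}{N}\sum_{i=1}^{N} \min_{1\le j\le k} f_i(x_j)$ by alternating a cluster-assignment step with a per-cluster update, in the spirit of Lloyd's algorithm. Below is a minimal illustrative sketch of that general pattern (not the authors' implementation): each sub-function $f_i$ is assigned to its minimizing parameter, and each parameter then takes a gradient step on the average of its assigned sub-functions. The function names and the $k$-means-style quadratic sub-functions in the usage example are assumptions for illustration.

```python
import numpy as np

def sum_of_min_objective(params, fis):
    """Average over i of min_j f_i(x_j)."""
    return float(np.mean([min(f(x) for x in params) for f in fis]))

def generalized_lloyd(params, fis, grads, steps=50, lr=0.1):
    """Alternate assignment and per-cluster gradient updates.

    params : list of k parameter vectors x_j
    fis    : list of N sub-functions f_i(x)
    grads  : list of N gradients of the f_i
    """
    params = [p.copy() for p in params]
    for _ in range(steps):
        # Assignment step: each sub-function joins the parameter minimizing it.
        assign = [min(range(len(params)), key=lambda j, f=f: f(params[j]))
                  for f in fis]
        # Update step: gradient descent on each group's averaged sub-function.
        for j in range(len(params)):
            group = [i for i, a in enumerate(assign) if a == j]
            if group:
                g = np.mean([grads[i](params[j]) for i in group], axis=0)
                params[j] = params[j] - lr * g
    return params

# Usage: classic k-means as a special case, with f_i(x) = ||a_i - x||^2.
data = [np.array([0.0, 0.0]), np.array([0.1, 0.0]),
        np.array([5.0, 5.0]), np.array([5.1, 5.0])]
fis = [lambda x, a=a: float(np.sum((x - a) ** 2)) for a in data]
grads = [lambda x, a=a: 2.0 * (x - a) for a in data]
init = [np.array([1.0, 1.0]), np.array([4.0, 4.0])]
final = generalized_lloyd(init, fis, grads)
```

With richer sub-functions (e.g. per-sample regression losses), the same loop recovers generalizations such as mixed linear regression, which is the flexibility the abstract refers to.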
