

Poster

Projection onto Minkowski Sums with Application to Constrained Learning

Joong-Ho (Johann) Won · Jason Xu · Kenneth Lange

Pacific Ballroom #190

Keywords: [ Convex Optimization ] [ Large Scale Learning and Big Data ] [ Optimization - Others ] [ Sparsity and Compressed Sensing ] [ Transfer and Multitask Learning ]


Abstract: We introduce block descent algorithms for projecting onto Minkowski sums of sets. Projection onto such sets is a crucial step in many statistical learning problems, and may regularize the complexity of solutions to an optimization problem or arise in dual formulations of penalty methods. We show that projecting onto the Minkowski sum admits simple, efficient algorithms when complications such as overlapping constraints pose challenges to existing methods. We prove that our algorithm converges linearly when the sets are strongly convex or satisfy an error bound condition, and extend the theory and methods to encompass non-convex sets as well. We demonstrate empirical advantages in runtime and accuracy over competitors in applications to $\ell_{1,p}$-regularized learning, the constrained lasso, and the overlapping group lasso.
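To make the block descent idea concrete, here is a minimal sketch (not the authors' implementation) of projecting a point $x$ onto a Minkowski sum $A + B$ by alternately minimizing $\|x - (a + b)\|^2$ in $a$ with $b$ fixed and in $b$ with $a$ fixed; each partial update reduces to a single projection onto one set. The choice of sets (an $\ell_2$ ball and a box) and all names in the snippet are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def project_ball(v, radius=1.0):
    # Euclidean projection onto the l2 ball of the given radius (assumed set A).
    norm = np.linalg.norm(v)
    return v if norm <= radius else v * (radius / norm)

def project_box(v, lo=-0.5, hi=0.5):
    # Euclidean projection onto the box [lo, hi]^n (assumed set B).
    return np.clip(v, lo, hi)

def project_minkowski_sum(x, proj_A, proj_B, max_iter=100, tol=1e-10):
    """Block descent sketch for projecting x onto A + B.

    Alternately updates a = P_A(x - b) and b = P_B(x - a), i.e. block
    coordinate minimization of ||x - (a + b)||^2 over (a, b) in A x B.
    """
    a = np.zeros_like(x)
    b = np.zeros_like(x)
    for _ in range(max_iter):
        a_new = proj_A(x - b)
        b_new = proj_B(x - a_new)
        change = np.linalg.norm(a_new - a) + np.linalg.norm(b_new - b)
        a, b = a_new, b_new
        if change < tol:
            break
    return a + b, a, b

# Usage: project a point onto (unit l2 ball) + (box [-0.5, 0.5]^3).
x = np.array([3.0, -2.0, 1.0])
p, a, b = project_minkowski_sum(x, project_ball, project_box)
print(p)
```

Under convexity of both sets, each block update is exact, so the objective is monotonically non-increasing; the paper's linear convergence guarantees rely on additional conditions (strong convexity or an error bound) beyond this sketch.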
