Semismooth Newton Algorithm for Efficient Projections onto $\ell_{1, \infty}$-norm Ball

Dejun Chu · Changshui Zhang · Shiliang Sun · Qing Tao

Keywords: [ Convex Optimization ] [ Large Scale Learning and Big Data ] [ Sparsity and Compressed Sensing ] [ Optimization - Convex ]

Tue 14 Jul 7 a.m. PDT — 7:45 a.m. PDT
Tue 14 Jul 8 p.m. PDT — 8:45 p.m. PDT

Abstract: The structured sparsity-inducing $\ell_{1, \infty}$-norm, a generalization of the classical $\ell_1$-norm, plays an important role in jointly sparse models, which select or remove all the variables in a group simultaneously. However, the resulting problem is more difficult to solve than the conventional $\ell_1$-norm constrained problem. In this paper, we propose an efficient algorithm for the Euclidean projection onto the $\ell_{1, \infty}$-norm ball. We tackle the projection problem with a semismooth Newton algorithm that solves a system of semismooth equations. Moreover, exploiting the structure of the Jacobian matrix via LU decomposition yields an equivalent algorithm that is proved to terminate after a finite number of iterations. Empirical studies demonstrate that our proposed algorithm outperforms the existing state-of-the-art solver and is promising for optimizing learning problems with an $\ell_{1, \infty}$-norm ball constraint.
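To make the problem concrete, here is a minimal sketch of the $\ell_{1,\infty}$ norm ($\|X\|_{1,\infty} = \sum_i \max_j |X_{ij}|$) and of the Euclidean projection onto its ball. The projection below is *not* the paper's semismooth Newton method; it is a generic baseline that bisects on the Lagrange multiplier, using the Moreau decomposition ($\mathrm{prox}_{\mu\|\cdot\|_\infty}(y) = y - P_{\mu B_1}(y)$, where $P_{\mu B_1}$ is the projection onto the $\ell_1$ ball of radius $\mu$) applied row by row. All function names are illustrative.

```python
import numpy as np

def l1_inf_norm(X):
    # ||X||_{1,inf}: sum over rows of the largest absolute entry in each row.
    return np.sum(np.max(np.abs(X), axis=1))

def proj_l1_ball(v, r):
    # Standard sort-based Euclidean projection of vector v onto {x : ||x||_1 <= r}.
    u = np.abs(v)
    if u.sum() <= r:
        return v.copy()
    s = np.sort(u)[::-1]                      # sorted magnitudes, descending
    cssv = np.cumsum(s) - r
    rho = np.nonzero(s * np.arange(1, len(s) + 1) > cssv)[0][-1]
    theta = cssv[rho] / (rho + 1.0)           # soft-threshold level
    return np.sign(v) * np.maximum(u - theta, 0.0)

def proj_l1inf_ball(Y, c, iters=100):
    # Illustrative baseline (not the paper's algorithm): bisect on the
    # multiplier mu of the constraint ||X||_{1,inf} <= c.  For fixed mu, each
    # row solves min 0.5||x - y||^2 + mu*||x||_inf, whose solution is
    # y - proj_l1_ball(y, mu) by the Moreau decomposition.
    if l1_inf_norm(Y) <= c:
        return Y.copy()
    lo, hi = 0.0, np.max(np.sum(np.abs(Y), axis=1))  # rows vanish once mu >= row l1-norm
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        X = np.vstack([row - proj_l1_ball(row, mu) for row in Y])
        if l1_inf_norm(X) > c:
            lo = mu          # constraint still violated: increase shrinkage
        else:
            hi = mu
    return X
```

Because the row-wise maxima are tied together through the single constraint, the per-row prox steps couple only through the scalar $\mu$, which is what makes bisection (and, more efficiently, the Newton-type root-finding the paper develops) applicable.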
