
 
Poster
Multiplicative Weights Updates as a distributed constrained optimization algorithm: Convergence to second-order stationary points almost always
Ioannis Panageas · Georgios Piliouras · Xiao Wang

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #99
Non-concave maximization has been the subject of much recent study in the optimization and machine learning communities, specifically in deep learning. Recent papers ([Ge et al. 2015, Lee et al. 2017] and references therein) indicate that first order methods work well and avoid saddle points. Results as in [Lee et al. 2017], however, are limited to the unconstrained case or to cases where the critical points lie in the interior of the feasibility set, which fails to capture some of the most interesting applications. In this paper we focus on constrained non-concave maximization. We analyze a variant of a well-established algorithm in machine learning called Multiplicative Weights Update (MWU) for the maximization problem $\max_{\mathbf{x} \in D} P(\mathbf{x})$, where $P$ is non-concave and twice continuously differentiable, and $D$ is a product of simplices. We show that, for small enough stepsizes, MWU converges almost always to critical points that satisfy the second-order KKT conditions, by combining techniques from dynamical systems with a recent connection between the Baum-Eagon inequality and MWU [Palaiopanos et al. 2017].
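
The setting can be illustrated with a short sketch. Below is a minimal, hedged example assuming the linear-variant MWU update $x_i \leftarrow x_i (1 + \epsilon \, \partial P/\partial x_i) / (1 + \epsilon \langle \mathbf{x}, \nabla P(\mathbf{x}) \rangle)$, the form tied to the Baum-Eagon inequality in [Palaiopanos et al. 2017]; the exact variant analyzed in the paper may differ, and the toy objective P, the matrix A, the stepsize, and the function names are illustrative, not from the paper.

import numpy as np

def mwu_step(x, grad, eps):
    # Multiplicative reweighting followed by renormalization onto the simplex:
    # x_i <- x_i (1 + eps * grad_i) / (1 + eps * <x, grad>).
    w = x * (1.0 + eps * grad)   # eps must be small enough that 1 + eps*grad stays positive
    return w / w.sum()

# Toy non-concave objective P(x, y) = (x^T A y)^2 over a product of two simplices.
A = np.arange(12, dtype=float).reshape(3, 4)

def grad_blocks(x, y):
    s = x @ A @ y
    return 2.0 * s * (A @ y), 2.0 * s * (A.T @ x)   # dP/dx and dP/dy

x, y = np.ones(3) / 3, np.ones(4) / 4   # uniform starting point on each simplex block
for _ in range(2000):
    gx, gy = grad_blocks(x, y)
    x, y = mwu_step(x, gx, eps=0.01), mwu_step(y, gy, eps=0.01)

print(x, y, (x @ A @ y) ** 2)   # iterates approach a KKT point of the constrained problem

Each simplex block is updated independently with the same multiplicative rule, which keeps the iterates inside the feasible product of simplices without any projection step.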

Author Information

Ioannis Panageas (SUTD)
Georgios Piliouras (Singapore University of Technology and Design)
Xiao Wang (Singapore University of Technology and Design)
