Talk in Workshop: Beyond first order methods in machine learning systems

Spotlight talk 6 - Competitive Mirror Descent

Florian Schäfer


Abstract:

Constrained competitive optimization involves multiple agents trying to minimize conflicting objectives, subject to constraints. This is a highly expressive modeling language that subsumes most of modern machine learning. In this work we propose competitive mirror descent (CMD): a general method for solving such problems based on first order information that can be obtained by automatic differentiation. First, by adding Lagrange multipliers, we obtain a simplified constraint set with an associated Bregman potential. At each iteration, we then solve for the Nash equilibrium of a regularized bilinear approximation of the full problem to obtain a direction of movement of the agents. Finally, we obtain the next iterate by following this direction according to the dual geometry induced by the Bregman potential. By using the dual geometry we obtain feasible iterates despite only solving a linear system at each iteration, eliminating the need for projection steps while still accounting for the global nonlinear structure of the constraint set. As a special case we obtain a novel competitive multiplicative weights algorithm for problems on the positive cone.
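The abstract walks through the iteration step by step, so a minimal sketch of one such step may help fix ideas. The sketch below covers only the special case named at the end: two players on the positive cone with the negative-entropy Bregman potential, which makes the dual-geometry step multiplicative. It is a reading of the abstract, not the authors' implementation: it assumes the Lagrange-multiplier reduction has already been applied, takes the gradients and mixed second derivatives as explicit inputs rather than obtaining them by automatic differentiation, solves the linear system densely for clarity, and uses hypothetical names (`cmd_step`, `eta`).

```python
import numpy as np

def cmd_step(x, y, grad_x, grad_y, hess_xy, hess_yx, eta=0.1):
    """One sketched CMD iteration on the positive cone.

    Uses the negative-entropy potential psi(z) = sum(z*log(z) - z),
    so grad(psi) = log, its inverse is exp, and D^2 psi(z) = diag(1/z).

    Inputs, evaluated at the current iterate (x, y):
      grad_x  = D_x f   (the x-player minimizes f)
      grad_y  = D_y g   (the y-player minimizes g)
      hess_xy = D_xy f  (shape len(x) x len(y))
      hess_yx = D_yx g  (shape len(y) x len(x))
    """
    nx = x.size
    # Hessians of the Bregman potential act as local metrics.
    Hx = np.diag(1.0 / x) / eta
    Hy = np.diag(1.0 / y) / eta
    # Stationarity conditions of the regularized bilinear local game:
    #   D_x f + (D_xy f) dy + (1/eta) D^2 psi(x) dx = 0
    #   D_y g + (D_yx g) dx + (1/eta) D^2 psi(y) dy = 0
    # Solved densely here for clarity; the abstract only promises *a*
    # linear system per iteration, and a practical implementation would
    # presumably solve it matrix-free from autodiff Hessian-vector products.
    M = np.block([[Hx, hess_xy],
                  [hess_yx, Hy]])
    d = np.linalg.solve(M, -np.concatenate([grad_x, grad_y]))
    dx, dy = d[:nx], d[nx:]
    # Follow the direction in the dual geometry: push dx to the dual via
    # D^2 psi (dx -> dx/x), step there, and pull back with exp. The update
    # is multiplicative, so iterates stay strictly positive, no projection.
    x_new = x * np.exp(dx / x)
    y_new = y * np.exp(dy / y)
    return x_new, y_new

# Toy zero-sum usage: f(x, y) = x^T A y and g = -f on the positive cone.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x, y = np.ones(3), np.ones(3)
for _ in range(100):
    x, y = cmd_step(x, y, A @ y, -(A.T @ x), A, -A.T, eta=0.05)
```

As a sanity check, with hess_xy and hess_yx set to zero the step reduces to x * exp(-eta * grad_x), i.e. plain entropic mirror descent (multiplicative weights), which matches the abstract's claim that following the direction in the dual geometry yields feasible iterates without projection steps.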
