

Plenary Talk in Workshop: Beyond first-order methods in machine learning systems

Algorithms for Deterministically Constrained Stochastic Optimization

Frank E Curtis


Abstract:

We discuss algorithms for solving optimization problems in which the objective function is defined by an expectation and the constraints are deterministic. Numerous algorithms exist for solving such problems when the constraints define a convex feasible region. By contrast, we propose algorithms for situations in which the constraints are nonlinear equations and/or inequalities that define nonconvex feasible regions. Our algorithms, for which convergence and worst-case complexity guarantees are offered in expectation, vastly outperform naive extensions of first-order methods to this constrained setting.
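For reference, the problem class described in the abstract can be written in a standard form as follows; this is a minimal sketch in generic notation (the symbols f, ξ, c_E, and c_I are assumptions, not taken from the talk itself):

\[
\min_{x \in \mathbb{R}^n} \; \mathbb{E}_{\xi}\bigl[ f(x, \xi) \bigr]
\quad \text{subject to} \quad
c_E(x) = 0, \qquad c_I(x) \le 0,
\]

where the objective is an expectation over the random variable \(\xi\), and \(c_E\) and \(c_I\) are deterministic, possibly nonlinear constraint functions whose feasible region need not be convex.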