Poster
Quadratically Regularized Subgradient Methods for Weakly Convex Optimization with Weakly Convex Constraints
Runchao Ma · Qihang Lin · Tianbao Yang

Tue Jul 14 07:00 AM -- 07:45 AM & Tue Jul 14 06:00 PM -- 06:45 PM (PDT)

Optimization models with non-convex constraints arise in many machine learning tasks, e.g., learning with fairness constraints or Neyman-Pearson classification with non-convex loss. Although many efficient methods with theoretical convergence guarantees have been developed for unconstrained non-convex problems, it remains a challenge to design provably efficient algorithms for problems with non-convex functional constraints. This paper proposes a class of subgradient methods for constrained optimization where the objective function and the constraint functions are weakly convex and nonsmooth. Our methods solve a sequence of strongly convex subproblems, where a quadratic regularization term is added to both the objective function and each constraint function. Each subproblem can be solved by various algorithms for strongly convex optimization. Under a uniform Slater's condition, we establish the computational complexity of our methods for finding a nearly stationary point.
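To make the subproblem structure concrete, here is a minimal sketch in our own notation (the symbols f, g_i, x_k, rho, and rho-hat are assumptions for illustration, not taken from the paper): suppose the objective f and each constraint g_i, i = 1, ..., m, are rho-weakly convex, x_k is the current iterate, and rho-hat > rho is a chosen regularization parameter. The quadratically regularized subproblem at x_k would then read

    \min_{x \in \mathbb{R}^d} \; f(x) + \frac{\hat{\rho}}{2} \|x - x_k\|^2
    \quad \text{s.t.} \quad
    g_i(x) + \frac{\hat{\rho}}{2} \|x - x_k\|^2 \le 0, \qquad i = 1, \dots, m.

Since adding (rho-hat/2)||x - x_k||^2 to a rho-weakly convex function makes it (rho-hat - rho)-strongly convex, both the subproblem's objective and its constraints are strongly convex, which is what allows each subproblem to be handed to an off-the-shelf strongly convex solver, as the abstract describes.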

Author Information

Runchao Ma (University of Iowa)
Qihang Lin (University of Iowa)
Tianbao Yang (University of Iowa)
