

Poster

Efficient Algorithms for Empirical Group Distributionally Robust Optimization and Beyond

Dingzhi Yu · Yunuo Cai · Wei Jiang · Lijun Zhang


Abstract: We investigate the empirical counterpart of group distributionally robust optimization (GDRO), which aims to minimize the maximal empirical risk across $m$ distinct groups. We formulate empirical GDRO as a *two-level* finite-sum convex-concave optimization problem and develop a stochastic variance-reduced mirror prox algorithm. Unlike existing methods, we construct the stochastic gradient via a per-group sampling technique and perform variance reduction for all groups, which fully exploits the *two-level* finite-sum structure of empirical GDRO. Moreover, we construct the snapshot and mirror snapshot points by a one-index-shifted weighted average, which distinguishes our method from the naive ergodic average. Our algorithm also supports adaptive parameters, which our experiments show to be beneficial. We establish convergence guarantees both in expectation and with high probability, demonstrating a complexity of $\mathcal{O}\left(\frac{m\sqrt{\bar{n}\ln{m}}}{\varepsilon}\right)$. Remarkably, our approach outperforms the state of the art by a factor of $\sqrt{m}$. Furthermore, we extend our methodology to the empirical minimax excess risk optimization (MERO) problem and establish the corresponding expectation and high-probability bounds. The complexity of our empirical MERO algorithm matches that of empirical GDRO at $\mathcal{O}\left(\frac{m\sqrt{\bar{n}\ln{m}}}{\varepsilon}\right)$, significantly surpassing the bounds of existing methods.
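The abstract does not spell out the full algorithm (the variance-reduced mirror prox with one-index-shifted snapshot averaging). As a rough illustration of the min-max formulation $\min_x \max_{q \in \Delta_m} \sum_i q_i R_i(x)$ and the per-group sampling idea only, here is a minimal stochastic descent-ascent sketch on synthetic data. The linear-regression setup, all step sizes, and all names are hypothetical; this is plain simultaneous SGDA with an exponentiated-gradient (KL mirror) step on the group weights, not the authors' variance-reduced method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: m groups, each holding its own linear-regression
# data generated from a shared parameter vector.  The empirical GDRO
# objective is min_x max_{q in simplex} sum_i q_i * R_i(x), where R_i
# is the average squared loss on group i.
m, d = 5, 10
w_true = rng.standard_normal(d)
data = []
for n in rng.integers(50, 150, size=m):          # unequal group sizes
    A = rng.standard_normal((int(n), d))
    b = A @ w_true + 0.1 * rng.standard_normal(int(n))
    data.append((A, b))

def risk(i, x):
    """Empirical risk of group i (average squared loss)."""
    A, b = data[i]
    return np.mean((A @ x - b) ** 2) / 2

def group_grad(i, x):
    """Stochastic gradient of group i's risk at one sampled point."""
    A, b = data[i]
    j = rng.integers(len(b))
    a, y = A[j], b[j]
    return (a @ x - y) * a

x = np.zeros(d)
q = np.ones(m) / m                               # weights on the simplex
eta_x, eta_q = 1e-2, 1e-1                        # hypothetical step sizes

worst0 = max(risk(i, x) for i in range(m))       # worst-group risk at start

for _ in range(2000):
    # Per-group sampling: draw one stochastic gradient from *every*
    # group each iteration (the abstract's scheme, heavily simplified).
    gs = np.array([group_grad(i, x) for i in range(m)])
    losses = np.array([risk(i, x) for i in range(m)])
    x -= eta_x * (q @ gs)                        # descent step in x
    q *= np.exp(eta_q * losses)                  # KL mirror ascent in q
    q /= q.sum()                                 # project back to simplex

worst = max(risk(i, x) for i in range(m))        # worst-group risk at end
```

After training, the weights `q` concentrate on the hardest groups, and the worst-group risk `worst` drops well below its starting value `worst0`.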
