Poster
in
Workshop: Principles of Distribution Shift (PODS)

Maximum Mean Discrepancy Distributionally Robust Nonlinear Chance-Constrained Optimization with Finite-Sample Guarantee

Yassine Nemmour · Heiner Kremer · Bernhard Schölkopf · Jia-Jie Zhu


Abstract:

Distributionally robust chance-constrained programs (DRCCP) provide a powerful framework for chance-constrained optimization in the presence of distributional uncertainty. However, such programs based on the popular Wasserstein ambiguity sets usually require restrictive assumptions on the constraint functions. To overcome these limitations, we propose a practical DRCCP algorithm using kernel maximum mean discrepancy (MMD) ambiguity sets, which we term MMD-DRCCP, to treat general nonlinear constraints without using ad-hoc reformulation techniques. MMD-DRCCP can handle general nonlinear and non-convex constraints with a proven finite-sample constraint satisfaction guarantee at a dimension-independent $\mathcal{O}(1/N)$ rate, achievable by a practical algorithm. We further propose an efficient bootstrap scheme for constructing sharp MMD ambiguity sets in practice without relying on computationally costly cross-validation procedures.
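The MMD ambiguity sets above are balls, in the kernel mean embedding distance, around the empirical distribution of the samples. The sketch below is not from the paper; it only illustrates the underlying quantity: a biased empirical estimate of the squared MMD between two sample sets under a Gaussian kernel, where the function name `mmd_squared` and the bandwidth parameter `gamma` are illustrative choices.

```python
import numpy as np

def mmd_squared(X, Y, gamma=1.0):
    """Biased empirical estimate of squared MMD with a Gaussian kernel.

    Equals the squared RKHS distance between the empirical kernel mean
    embeddings of X and Y, so it is always nonnegative.
    """
    def k(A, B):
        # Pairwise squared Euclidean distances via broadcasting,
        # then the Gaussian kernel exp(-gamma * ||a - b||^2).
        d2 = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-gamma * d2)

    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))   # samples from P
Y = rng.normal(0.0, 1.0, size=(200, 2))   # fresh samples from the same P
Z = rng.normal(3.0, 1.0, size=(200, 2))   # samples from a shifted distribution

print(mmd_squared(X, Y))  # near zero: same distribution
print(mmd_squared(X, Z))  # clearly larger: distributions differ
```

An MMD ambiguity set of radius $\varepsilon$ then contains every distribution whose embedding lies within $\varepsilon$ of the empirical one; the paper's bootstrap scheme is a way of calibrating that radius without cross-validation.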
