
Generalised Lipschitz Regularisation Equals Distributional Robustness
Zac Cranko · Zhan Shi · Xinhua Zhang · Richard Nock · Simon Kornblith

Wed Jul 21 07:25 AM -- 07:30 AM (PDT)

The problem of adversarial examples has highlighted the need for a theory of regularisation that is general enough to apply to exotic function classes, such as universal approximators. In response, we significantly sharpen existing results on the relationship between distributional robustness and regularisation when the uncertainty set is defined by a transportation cost. The theory characterises the conditions under which the distributionally robust objective equals a Lipschitz-regularised objective, and, for the first time, tightly quantifies the slackness between them under very mild assumptions. As a theoretical application, we prove a new result explicating the connection between adversarial learning and distributional robustness. We then give new results on how to achieve Lipschitz regularisation of kernel classifiers, which we demonstrate experimentally.
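As a rough schematic of the kind of identity studied here (the notation below is illustrative and not taken verbatim from the paper): for a loss \(\ell\), model \(f\), reference distribution \(P\), and a transportation-cost ball of radius \(\varepsilon\) built from a cost \(c\), the distributionally robust risk is compared against a Lipschitz-penalised risk,

\[
  \sup_{Q \,:\, W_c(Q, P) \le \varepsilon}
    \mathbb{E}_{(x,y)\sim Q}\big[\ell(f(x), y)\big]
  \;=\;
  \mathbb{E}_{(x,y)\sim P}\big[\ell(f(x), y)\big]
  \;+\; \varepsilon \,\mathrm{Lip}_c(\ell \circ f)
  \;+\; \mathrm{slack},
\]

where \(\mathrm{Lip}_c\) denotes a Lipschitz constant measured with respect to the cost \(c\); the paper characterises when the slack term vanishes and quantifies it otherwise.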

Author Information

Zac Cranko (Universität Tübingen)
Zhan Shi (University of Illinois at Chicago)
Xinhua Zhang (University of Illinois at Chicago)
Richard Nock (Google Brain)
Simon Kornblith (Google Brain)
