

Fast Excess Risk Rates via Offset Rademacher Complexity

Chenguang Duan · Yuling Jiao · Lican Kang · Xiliang Lu · Jerry Yang

Exhibit Hall 1 #903


Based on the offset Rademacher complexity, this work develops a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. Beyond recovering, in a unified way, fast rates for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for LAD (sparse) linear regression and for logistic regression with deep ReLU neural networks.
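For readers unfamiliar with the central quantity, the following is the standard definition of offset Rademacher complexity (in the form introduced by Liang, Rakhlin, and Sridharan); it is not stated in the abstract itself, and the specific symbols are illustrative:

```latex
% Offset Rademacher complexity of a function class F on a sample x_1,...,x_n,
% with i.i.d. Rademacher signs eps_i and offset parameter c > 0.
% The quadratic penalty -c f(x_i)^2 "offsets" the sup, which is what yields
% fast (localization-free) excess risk rates.
\mathcal{R}_n^{\mathrm{off}}(\mathcal{F}; c)
  \;=\;
  \mathbb{E}_{\varepsilon}
  \sup_{f \in \mathcal{F}}
  \frac{1}{n} \sum_{i=1}^{n}
  \Bigl( \varepsilon_i f(x_i) \;-\; c\, f(x_i)^2 \Bigr)
```

Compared with the ordinary Rademacher complexity (the case $c = 0$), the negative quadratic term automatically localizes the supremum around small-norm functions, which is how excess risk bounds of fast order can be obtained without invoking a Bernstein condition.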
