Fast Excess Risk Rates via Offset Rademacher Complexity
Chenguang Duan · Yuling Jiao · Lican Kang · Xiliang Lu · Jerry Yang
2023 Poster
Abstract
Based on offset Rademacher complexity, this work outlines a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. In addition to recovering fast rates in a unified manner for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for least absolute deviation (LAD) (sparse) linear regression and for logistic regression with deep ReLU neural networks, respectively.
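For context, a common form of the offset Rademacher complexity from the literature is sketched below; the paper's exact definition and offset constant may differ. Here $\mathcal{F}$ is a function class, $x_1,\dots,x_n$ a sample, $\varepsilon_1,\dots,\varepsilon_n$ i.i.d. Rademacher signs, and $c>0$ an offset parameter (all notation introduced here for illustration):

\[
\mathfrak{R}_n^{\mathrm{off}}(\mathcal{F};c) \;=\; \mathbb{E}_{\varepsilon}\,\sup_{f\in\mathcal{F}}\ \frac{1}{n}\sum_{i=1}^{n}\Big(\varepsilon_i f(x_i) - c\, f(x_i)^2\Big).
\]

Roughly speaking, the negative quadratic offset penalizes functions with large values and thereby supplies the localization that otherwise comes from curvature assumptions such as the Bernstein condition, which is why bounds of this type can yield fast excess risk rates.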