Optimistic Bounds for Multi-output Learning
Henry Reeve · Ata Kaban

Thu Jul 16 12:00 PM -- 12:45 PM & Fri Jul 17 01:00 AM -- 01:45 AM (PDT)

We investigate the challenge of multi-output learning, where the goal is to learn a vector-valued function from a supervised data set. This setting encompasses a range of important problems in machine learning, including multi-target regression, multi-class classification, and multi-label classification. We begin our analysis by introducing the self-bounding Lipschitz condition for multi-output loss functions, which interpolates continuously between a classical Lipschitz condition and a multi-dimensional analogue of a smoothness condition. We then show that the self-bounding Lipschitz condition gives rise to optimistic bounds for multi-output learning, which attain the minimax optimal rate up to logarithmic factors. The proof exploits local Rademacher complexity combined with a powerful minoration inequality due to Srebro, Sridharan and Tewari. As an application, we derive a state-of-the-art generalisation bound for multi-class gradient boosting.
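As a rough guide to the condition the abstract introduces, a hedged sketch follows; the exact norm, constants, and parameter range here are assumptions from the general shape of such conditions, not a quotation of the paper's definition. A loss $\mathcal{L}$ would be called $(\lambda, \mu)$-self-bounding Lipschitz when, for all outputs $y$ and all score vectors $u, v$,

```latex
% Illustrative (λ, μ)-self-bounding Lipschitz condition (details are assumptions)
\left| \mathcal{L}(u, y) - \mathcal{L}(v, y) \right|
  \;\le\; \lambda \cdot \max\bigl\{ \mathcal{L}(u, y),\, \mathcal{L}(v, y) \bigr\}^{\mu}
          \cdot \left\| u - v \right\|_{\infty} .
```

Under this reading, $\mu = 0$ recovers a classical Lipschitz condition, while larger $\mu$ lets the allowed variation of the loss shrink where the loss itself is small, which is the interpolation towards a smoothness-type condition that the abstract describes; consult the paper for the precise statement.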

Author Information

Henry Reeve (University of Birmingham)

I am a postdoctoral research fellow in Machine Learning at the University of Birmingham. I am working on the FORGING EPSRC research project led by Professor Ata Kabán. The goal of the project is to explore geometric structures which enable efficient learning from small data samples in high-dimensional learning scenarios. We are particularly focused on the role played by Random Projections of the data onto a low-dimensional subspace. Within the scope of this project I am focusing on a variety of problems including learning with label noise, mixture proportion estimation, learning in unbounded domains, and matrix factorisation. I did a PhD in the School of Computer Science at the University of Manchester under the supervision of Professor Gavin Brown. My research focus was on learning scenarios with asymmetric costs in high dimensions, with connections to Neyman-Pearson classification and multi-armed bandits. Areas of interest: minimax rates, label noise, mixture proportion estimation, weakly supervised learning, random projections, compressive learning, matrix factorisation, dimensionality reduction, multi-armed bandits.

Ata Kaban (University of Birmingham)