

Disambiguation of Weak Supervision leading to Exponential Convergence rates

Vivien Cabannes · Francis Bach · Alessandro Rudi

Keywords: [ Models of Learning and Generalization ]


Machine learning approached through supervised learning requires expensive annotation of data. This motivates weakly supervised learning, where data are annotated with incomplete yet discriminative information. In this paper, we focus on partial labelling, an instance of weak supervision where each input is associated with a set of potential targets. We review a disambiguation principle to recover full supervision from weak supervision, and propose an empirical disambiguation algorithm. We prove exponential convergence rates of our algorithm under classical learnability assumptions, and we illustrate the usefulness of our method on practical examples.
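To make the partial-labelling setting concrete: each input comes with a candidate set of labels, and disambiguation means selecting one label per set so that the resulting fully labelled dataset is self-consistent. The sketch below is an illustrative alternating scheme (fit a classifier on the current guesses, then re-select, within each candidate set, the label the classifier prefers), using a nearest-centroid classifier for simplicity. It is not the paper's algorithm, which relies on an infimum-loss disambiguation principle with kernel methods; all names here are hypothetical.

```python
import numpy as np

def disambiguate(X, candidate_sets, n_iters=10):
    """Illustrative alternating disambiguation for partial labels.

    X: (n, d) array of inputs; candidate_sets: list of candidate-label lists.
    Repeats two steps until the labelling stabilises:
      (1) fit class centroids from the current label guesses,
      (2) reassign each point the candidate label with the nearest centroid.
    """
    # initialise with the first candidate of each set
    y = np.array([s[0] for s in candidate_sets])
    labels = sorted({c for s in candidate_sets for c in s})
    for _ in range(n_iters):
        # fall back to the global mean for classes with no current members
        centroids = {c: (X[y == c].mean(axis=0) if np.any(y == c)
                         else X.mean(axis=0))
                     for c in labels}
        y_new = np.array([
            min(s, key=lambda c: np.linalg.norm(x - centroids[c]))
            for x, s in zip(X, candidate_sets)
        ])
        if np.array_equal(y_new, y):
            break
        y = y_new
    return y

# toy 1-D example: two clusters, most points only weakly labelled
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
candidate_sets = [[0], [0, 1], [0, 1], [0, 1], [0, 1], [1]]
print(disambiguate(X, candidate_sets))  # → [0 0 0 1 1 1]
```

Starting from only two fully labelled points (the first and last), the ambiguous points are resolved to the cluster-consistent labelling after two iterations.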
