
Strength from Weakness: Fast Learning Using Weak Supervision
Joshua Robinson · Stefanie Jegelka · Suvrit Sra

Wed Jul 15 08:00 AM -- 08:45 AM & Wed Jul 15 09:00 PM -- 09:45 PM (PDT)

We study generalization properties of weakly supervised learning, that is, learning where only a few "strong" labels (the actual targets for prediction) are present but many more "weak" labels are available. In particular, we show that pretraining using weak labels and fine-tuning using strong labels can accelerate the learning rate for the strong task to the fast rate of O(1/n), where n is the number of strongly labeled data points. This acceleration can happen even if, by itself, the strongly labeled data admits only the slower O(1/√n) rate. The acceleration depends continuously on the number of weak labels available and on the relation between the two tasks. Our theoretical results are reflected empirically across a range of tasks and illustrate how weak labels speed up learning on the strong task.
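The pretrain-then-fine-tune recipe the abstract describes can be illustrated with a minimal synthetic sketch (not the authors' experiments; all names and the data-generating model here are hypothetical): the strong target depends on the input only through a feature shared with the weak task, so abundant weak labels pin down the feature, and the few strong labels need only fit a low-dimensional head on top of it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: strong and weak targets share a linear feature w_true;
# the weak label is a noisy surrogate, the strong label a rescaled version.
d, n_weak, n_strong = 20, 5000, 10
w_true = rng.normal(size=d)

X_weak = rng.normal(size=(n_weak, d))
y_weak = X_weak @ w_true + rng.normal(scale=1.0, size=n_weak)  # many weak labels

X_strong = rng.normal(size=(n_strong, d))
y_strong = 2.0 * (X_strong @ w_true) + rng.normal(scale=0.1, size=n_strong)

# "Pretrain" on the weak task: least-squares estimate of the shared feature.
w_hat, *_ = np.linalg.lstsq(X_weak, y_weak, rcond=None)

# "Fine-tune" on the strong task: fit a single scalar head on the pretrained
# feature -- a 1-parameter problem, so a handful of strong labels suffices.
z = X_strong @ w_hat
alpha = float(z @ y_strong / (z @ z))

# Baseline: fit the full d-dimensional model from the strong data alone
# (underdetermined here, since n_strong < d).
w_direct, *_ = np.linalg.lstsq(X_strong, y_strong, rcond=None)

# Compare test errors against the noiseless strong target.
X_test = rng.normal(size=(1000, d))
y_test = 2.0 * (X_test @ w_true)
err_pretrained = float(np.mean((alpha * (X_test @ w_hat) - y_test) ** 2))
err_direct = float(np.mean((X_test @ w_direct - y_test) ** 2))
print(f"pretrained+fine-tuned MSE: {err_pretrained:.3f}")
print(f"strong-data-only MSE:      {err_direct:.3f}")
```

In this toy setting the pretrained model's error is governed by the small fine-tuning problem (one parameter, n = 10 strong labels), while the strong-only baseline must estimate all d parameters from the same 10 points, mirroring the fast-versus-slow rate contrast in the abstract.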

Author Information

Joshua Robinson (MIT)

I want to understand how machines can learn useful representations of the world. I am also interested in modeling diversity and its many applications in learning problems. I am Josh Robinson, a PhD student at MIT CSAIL & LIDS advised by Stefanie Jegelka and Suvrit Sra. I am part of the MIT machine learning group. Previously I was an undergraduate at the University of Warwick where I worked with Robert MacKay on probability theory.

Stefanie Jegelka (MIT)
Suvrit Sra (MIT)
