Modern machine learning methods often require more data for training than a single expert can provide. Therefore, it has become a standard procedure to collect data from multiple external sources, e.g., via crowdsourcing. Unfortunately, the quality of these sources is not always guaranteed. As further complications, the data might be stored in a distributed way, or might even have to remain private. In this work, we address the question of how to learn robustly in such scenarios. Studying the problem through the lens of statistical learning theory, we derive a procedure that allows for learning from all available sources, yet automatically suppresses irrelevant or corrupted data. We show by extensive experiments that our method provides significant improvements over alternative approaches from robust statistics and distributed optimization.
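The abstract does not spell out the procedure, so the following is only a minimal illustrative sketch of the general recipe it describes: estimate a weight for each source and then minimize a weighted empirical risk, so that low-quality sources are automatically suppressed. It is not the paper's actual estimator; the agreement-with-a-trusted-reference heuristic, the helper names source_weights and weighted_erm, and the use of scikit-learn's LogisticRegression are all assumptions made for this example.

import numpy as np
from sklearn.linear_model import LogisticRegression

def source_weights(sources, reference, model_cls=LogisticRegression):
    """Assign each source a weight in [0, 1] based on how well a model
    trained on that source alone predicts a small trusted reference set
    (an illustrative proxy for a source-quality / discrepancy measure)."""
    X_ref, y_ref = reference
    weights = []
    for X_s, y_s in sources:
        clf = model_cls(max_iter=1000).fit(X_s, y_s)
        agreement = clf.score(X_ref, y_ref)            # accuracy on the reference set
        weights.append(max(agreement - 0.5, 0.0) * 2)  # map [0.5, 1] onto [0, 1]; clamp below
    w = np.asarray(weights)
    return w / w.sum() if w.sum() > 0 else np.full(len(sources), 1 / len(sources))

def weighted_erm(sources, weights, model_cls=LogisticRegression):
    """Weighted empirical risk minimization: pool all sources and give each
    sample the weight of the source it came from."""
    X = np.vstack([X_s for X_s, _ in sources])
    y = np.concatenate([y_s for _, y_s in sources])
    sw = np.concatenate([np.full(len(y_s), w) for (_, y_s), w in zip(sources, weights)])
    return model_cls(max_iter=1000).fit(X, y, sample_weight=sw)

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def make_source(n, flip=0.0):
        # Synthetic binary classification data; 'flip' fraction of labels is corrupted.
        X = rng.normal(size=(n, 5))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
        mask = rng.random(n) < flip
        y[mask] = 1 - y[mask]
        return X, y

    sources = [make_source(200), make_source(200), make_source(200, flip=0.9)]
    reference = make_source(50)   # small trusted sample from the target distribution
    w = source_weights(sources, reference)
    model = weighted_erm(sources, w)
    X_test, y_test = make_source(1000)
    print("source weights:", np.round(w, 2))
    print("test accuracy:", model.score(X_test, y_test))

In the demo, the third source has 90% of its labels flipped; under this heuristic it should receive a near-zero weight, so the pooled model is trained almost entirely on the two clean sources.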
Author Information
Nikola Konstantinov (IST Austria)
Christoph H. Lampert (IST Austria)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Oral: Robust Learning from Untrusted Sources
  Wed. Jun 12th, 06:40 -- 07:00 PM, Room 102
More from the Same Authors
- 2021 Invited Talk 1: Q&A
  Christoph H. Lampert
- 2020 Invited Talk: Christoph H. Lampert, "Learning Theory for Continual and Meta-Learning"
  Christoph H. Lampert
- 2020 Poster: On the Sample Complexity of Adversarial Multi-Source PAC Learning
  Nikola Konstantinov · Elias Frantar · Dan Alistarh · Christoph H. Lampert
- 2019 Poster: Towards Understanding Knowledge Distillation
  Mary Phuong · Christoph H. Lampert
- 2019 Oral: Towards Understanding Knowledge Distillation
  Mary Phuong · Christoph H. Lampert
- 2018 Poster: Learning equations for extrapolation and control
  Subham S Sahoo · Christoph H. Lampert · Georg Martius
- 2018 Oral: Learning equations for extrapolation and control
  Subham S Sahoo · Christoph H. Lampert · Georg Martius
- 2018 Poster: Data-Dependent Stability of Stochastic Gradient Descent
  Ilja Kuzborskij · Christoph H. Lampert
- 2018 Oral: Data-Dependent Stability of Stochastic Gradient Descent
  Ilja Kuzborskij · Christoph H. Lampert
- 2017 Poster: PixelCNN Models with Auxiliary Variables for Natural Image Modeling
  Alexander Kolesnikov · Christoph H. Lampert
- 2017 Poster: Multi-task Learning with Labeled and Unlabeled Tasks
  Anastasia Pentina · Christoph H. Lampert
- 2017 Talk: Multi-task Learning with Labeled and Unlabeled Tasks
  Anastasia Pentina · Christoph H. Lampert
- 2017 Talk: PixelCNN Models with Auxiliary Variables for Natural Image Modeling
  Alexander Kolesnikov · Christoph H. Lampert