We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a training objective for machine learning models whose minimizers are minimal sufficient statistics with respect to the class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that, unlike standard mutual information, can be usefully applied to deterministically-dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning match state-of-the-art performance on supervised learning, uncertainty quantification, and adversarial robustness benchmarks.
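For context on the mutual-information claim: when Y = f(X) is a deterministic function of a continuous random variable X, mutual information is generically infinite, so it cannot distinguish between different deterministic maps. A brief sketch of this background fact (standard information theory, not the paper's definition of CDI):

\[
I(X; f(X)) = h(f(X)) - h(f(X) \mid X) = +\infty, \qquad \text{since } h(f(X) \mid X) = -\infty .
\]

By contrast, for an invertible, differentiable map f, the change-of-variables identity \( h(f(X)) = h(X) + \mathbb{E}\left[\log \lvert \det Df(X) \rvert\right] \) shows that subtracting the expected log-Jacobian from \( h(f(X)) \) recovers \( h(X) \): a quantity of this form is conserved under invertible maps. One might expect CDI to generalize this intuition to non-invertible maps like deep networks, though that is our reading; see the paper for the actual definition.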
Author Information
Milan Cvitkovic (California Institute of Technology)
Günther Koliander (Austrian Academy of Sciences)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Minimal Achievable Sufficient Statistic Learning »
  Thu. Jun 13th, 01:30 -- 04:00 AM, Room: Pacific Ballroom #84
More from the Same Authors
- 2022 Poster: A Differential Entropy Estimator for Training Neural Networks »
  Georg Pichler · Pierre Colombo · Malik Boudiaf · Günther Koliander · Pablo Piantanida
- 2022 Spotlight: A Differential Entropy Estimator for Training Neural Networks »
  Georg Pichler · Pierre Colombo · Malik Boudiaf · Günther Koliander · Pablo Piantanida
- 2019 Poster: Open Vocabulary Learning on Source Code with a Graph-Structured Cache »
  Milan Cvitkovic · Badal Singh · Anima Anandkumar
- 2019 Oral: Open Vocabulary Learning on Source Code with a Graph-Structured Cache »
  Milan Cvitkovic · Badal Singh · Anima Anandkumar