We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective whose minima are minimal sufficient statistics with respect to the class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that, unlike standard mutual information, can be usefully applied to deterministically-dependent continuous random variables such as the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.
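For context on why standard mutual information is unusable in this setting (a standard information-theoretic fact, not a claim from this page): when X is a continuous random variable and Y = f(X) is a deterministic function of it, the conditional differential entropy of Y given X diverges, so the mutual information is infinite and carries no useful training signal:

```latex
% Degeneracy of mutual information for deterministic continuous maps:
% given X, the variable Y = f(X) is a point mass, so its conditional
% differential entropy is -infinity and I(X;Y) diverges.
\begin{align*}
I(X;Y) &= h(Y) - h(Y \mid X), \\
h(Y \mid X) &= -\infty \quad \text{(since $Y$ is deterministic given $X$)}, \\
\therefore\ I(X;Y) &= +\infty .
\end{align*}
```

This is the degeneracy that CDI is designed to avoid for deterministically-dependent continuous variables.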
Author Information
Milan Cvitkovic (California Institute of Technology)
Günther Koliander (Austrian Academy of Sciences)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Oral: Minimal Achievable Sufficient Statistic Learning
  Wed. Jun 12th, 11:30 -- 11:35 PM, Room: Grand Ballroom
More from the Same Authors
- 2022 Poster: A Differential Entropy Estimator for Training Neural Networks
  Georg Pichler · Pierre Colombo · Malik Boudiaf · Günther Koliander · Pablo Piantanida
- 2022 Spotlight: A Differential Entropy Estimator for Training Neural Networks
  Georg Pichler · Pierre Colombo · Malik Boudiaf · Günther Koliander · Pablo Piantanida
- 2019 Poster: Open Vocabulary Learning on Source Code with a Graph-Structured Cache
  Milan Cvitkovic · Badal Singh · Anima Anandkumar
- 2019 Oral: Open Vocabulary Learning on Source Code with a Graph-Structured Cache
  Milan Cvitkovic · Badal Singh · Anima Anandkumar