Minimal Achievable Sufficient Statistic Learning
Milan Cvitkovic · Günther Koliander
Keywords: Adversarial Examples, Computer Vision, Deep Learning Theory, Information Theory and Estimation, Representation Learning
2019 Poster
Abstract
We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective for which the minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that — unlike standard mutual information — can be usefully applied to deterministically-dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.
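To make the idea concrete, below is a minimal sketch of what a MASS-style training step could look like in PyTorch. It assumes, since the abstract does not spell this out, that CDI for a deterministic map Y = f(X) takes the form h(f(X)) - E[log J_f(X)], with J_f(x) = sqrt(det(Df(x) Df(x)^T)) the Jacobian volume factor of the network f. The encoder, classifier head, toy data, beta weight, and the crude Gaussian entropy surrogate are all illustrative placeholders, not the authors' implementation or estimators.

```python
# Hypothetical sketch of a MASS-style objective: task loss + beta * CDI surrogate.
# Nothing here is the paper's actual code; the CDI form and the entropy
# estimator are assumptions made only to produce a runnable illustration.
import math
import torch
import torch.nn as nn
from torch.autograd.functional import jacobian

torch.manual_seed(0)

encoder = nn.Sequential(nn.Linear(8, 16), nn.Tanh(), nn.Linear(16, 3))  # f: R^8 -> R^3
classifier = nn.Linear(3, 2)  # task head on the representation f(x)
opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3)
beta = 0.1  # hypothetical trade-off weight

x = torch.randn(32, 8)           # toy inputs
y = torch.randint(0, 2, (32,))   # toy labels

def log_jacobian_factor(xi):
    # log sqrt(det(Df Df^T)) for a single input; Df is the 3x8 Jacobian of the encoder.
    J = jacobian(encoder, xi.unsqueeze(0), create_graph=True).reshape(3, 8)
    return 0.5 * torch.logdet(J @ J.T)

z = encoder(x)
task_loss = nn.functional.cross_entropy(classifier(z), y)

# Crude Gaussian surrogate for the differential entropy h(f(X)); the paper's
# estimator differs -- this term exists only to make the sketch self-contained.
cov = torch.cov(z.T) + 1e-4 * torch.eye(3)
entropy_surrogate = 0.5 * torch.logdet(cov) + 0.5 * 3 * math.log(2 * math.pi * math.e)

log_jac = torch.stack([log_jacobian_factor(xi) for xi in x]).mean()
cdi_surrogate = entropy_surrogate - log_jac  # stand-in for CDI(X -> f(X))

loss = task_loss + beta * cdi_surrogate
opt.zero_grad()
loss.backward()
opt.step()
print(f"task loss {task_loss.item():.3f}  CDI surrogate {cdi_surrogate.item():.3f}")
```

The per-sample Jacobian computation above is deliberately naive and would be too slow for realistic input dimensions; it only illustrates how a Jacobian-dependent penalty could enter a supervised objective alongside the task loss.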