Poster
Minimal Achievable Sufficient Statistic Learning
Milan Cvitkovic · Günther Koliander
Pacific Ballroom #84
Keywords: [ Adversarial Examples ] [ Computer Vision ] [ Deep Learning Theory ] [ Information Theory and Estimation ] [ Representation Learning ]
We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective for which the minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that — unlike standard mutual information — can be usefully applied to deterministically dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.
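The abstract's contrast with mutual information can be illustrated numerically: for a deterministic map Y = f(X) of a continuous X, the mutual information I(X; Y) is infinite, while a CDI-style quantity stays finite. The sketch below assumes a definition of the form C(X → f(X)) = h(f(X)) − E[log J_f(X)], with J_f(x) = √det(Df(x) Df(x)ᵀ); this is an illustrative assumption, not necessarily the paper's exact formulation. For an invertible linear map and Gaussian input, both terms are available in closed form, and the quantity reduces to h(X).

```python
# Hedged sketch (not the authors' code): evaluate a CDI-style quantity
#   C(X -> f(X)) = h(f(X)) - E[log J_f(X)],  J_f(x) = sqrt(det(Df Df^T)),
# for an invertible linear map f(x) = A x and Gaussian X ~ N(0, I).
# Here the Jacobian is constant, so E[log J_f(X)] = log|det A|.
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))   # assumed invertible linear map
Sigma = np.eye(n)                 # covariance of X ~ N(0, I)

def gaussian_entropy(cov):
    """Differential entropy of N(0, cov): 0.5 * log((2*pi*e)^d * det(cov))."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

h_X = gaussian_entropy(Sigma)
h_Y = gaussian_entropy(A @ Sigma @ A.T)   # Y = A X is Gaussian with cov A A^T
log_J = np.linalg.slogdet(A)[1]           # log|det A|, constant Jacobian term

cdi = h_Y - log_J   # finite; for invertible f this equals h(X)
assert np.isclose(cdi, h_X)
```

Under these assumptions the quantity is finite and invariant under invertible reparameterization of the output, which is exactly the regime (deterministically dependent continuous variables) where standard mutual information degenerates.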