Poster
Stochastic Deep Networks
Gwendoline De Bie · Gabriel Peyré · Marco Cuturi

Tue Jun 11th 06:30 -- 09:00 PM @ Pacific Ballroom #30

Machine learning is increasingly targeting areas where input data cannot be accurately described by a single vector, but can be modeled instead using the more flexible concept of random vectors, namely probability measures or, more simply, point clouds of varying cardinality. Using deep architectures on measures poses, however, many challenging issues. Indeed, deep architectures were originally designed to handle fixed-length vectors, or, using recursive mechanisms, ordered sequences thereof. In sharp contrast, measures describe a varying number of weighted observations with no particular order. We propose in this work a deep framework designed to handle crucial aspects of measures, namely permutation invariance and variations in weights and cardinality. Architectures derived from this pipeline can (i) map measures to measures, using the concept of push-forward operators; (ii) bridge the gap between measures and Euclidean spaces, through integration steps. This makes it possible to design discriminative networks (to classify or reduce the dimensionality of input measures), generative architectures (to synthesize measures) and recurrent pipelines (to predict measure dynamics). We provide a theoretical analysis of these building blocks, review our architectures' approximation abilities and robustness with respect to perturbations, and evaluate them on various discriminative and generative tasks.
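The two building blocks named in the abstract can be sketched concretely. The following is a minimal NumPy illustration, not the authors' implementation: a push-forward-style block maps a discrete measure (weights `w`, support points `x`) to a new measure over transformed points, and an integration step reduces a measure to a fixed-size Euclidean vector. The pairwise map `f_pair` and feature map `f_feat` are hypothetical stand-ins for learned layers; both operations are invariant to reordering the weighted points and accept any cardinality.

```python
import numpy as np

def pushforward_block(w, x, f):
    """Map a discrete measure (weights w, points x) to a measure with the
    same weights and new points y_i = sum_j w_j * f(x_i, x_j).
    Permutation-equivariant in i, invariant to reordering the (w_j, x_j)."""
    n = len(w)
    y = np.array([sum(w[j] * f(x[i], x[j]) for j in range(n))
                  for i in range(n)])
    return w, y

def integration_step(w, x, f):
    """Bridge measures and Euclidean space: return sum_j w_j * f(x_j),
    a fixed-size vector regardless of the measure's cardinality."""
    return sum(w[j] * f(x[j]) for j in range(len(w)))

# Hypothetical toy maps standing in for learned layers.
f_pair = lambda a, b: np.tanh(a + b)
f_feat = lambda a: np.maximum(a, 0.0)

# A point cloud of cardinality 3 in R^2 with uniform weights.
w = np.ones(3) / 3
x = np.array([[0.0, 1.0], [1.0, 0.0], [-1.0, 0.5]])

w2, y = pushforward_block(w, x, f_pair)   # measure -> measure
v = integration_step(w2, y, f_feat)       # measure -> vector in R^2
```

Stacking push-forward blocks and finishing with an integration step yields a discriminative network on point clouds; permuting the input points leaves the output vector unchanged.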

Author Information

Gwendoline De Bie (Ecole normale supérieure)
Gabriel Peyré (CNRS and ENS)
Marco Cuturi (ENSAE / CREST)
