Random Feature Expansions for Deep Gaussian Processes
Kurt Cutajar · Edwin Bonilla · Pietro Michiardi · Maurizio Filippone

Wed Aug 09 01:30 AM -- 05:00 AM (PDT) @ Gallery #126

The composition of multiple Gaussian processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work we introduce a novel formulation of DGPs based on random feature expansions that we train using stochastic variational inference. This yields a practical learning framework which significantly advances the state of the art in inference for DGPs, and enables accurate quantification of uncertainty. We extensively showcase the scalability and performance of our proposal on several datasets with up to 8 million observations, and various DGP architectures with up to 30 hidden layers.
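The random feature expansions the abstract refers to build on the random Fourier feature construction of Rahimi and Rechht's line of work: an RBF kernel is approximated by an explicit trigonometric feature map, which turns each GP layer into a finite Bayesian linear model that can be stacked and trained with stochastic variational inference. The sketch below (NumPy; an illustrative approximation, not the authors' implementation, with hypothetical names such as `rbf_random_features`) shows the single-layer building block: the feature map and how well its inner products match the exact RBF kernel.

```python
import numpy as np

def rbf_random_features(X, n_features=500, lengthscale=1.0, seed=0):
    """Random Fourier feature map approximating an RBF kernel.

    Phi(x) = sqrt(2/D) * cos(x @ Omega + b), with spectral frequencies
    Omega ~ N(0, 1/lengthscale^2) and phases b ~ Uniform[0, 2*pi],
    so that Phi(X) @ Phi(X).T approximates k(X, X).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    Omega = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ Omega + b)

# Compare the approximate kernel against the exact RBF kernel.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
Phi = rbf_random_features(X, n_features=5000)
K_approx = Phi @ Phi.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
max_err = np.abs(K_approx - K_exact).max()
```

In the DGP setting, the weights of each such linear layer are treated as random variables with a variational posterior, and the feature map of one layer is applied to the outputs of the previous layer, giving a deep architecture that remains amenable to minibatch training.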

Author Information

Kurt Cutajar (EURECOM)
Edwin Bonilla (UNSW)
Pietro Michiardi (EURECOM)

Pietro Michiardi received his M.S. in Computer Science from EURECOM and his M.S. in Electrical Engineering from Politecnico di Torino. He received his Ph.D. in Computer Science from Telecom ParisTech (formerly ENST, Paris), and his HDR (Habilitation) from UNSA. Pietro is a Professor of Computer Science at EURECOM, where he leads the Distributed Systems Group, which blends theory and systems research on large-scale distributed systems (including data processing and data storage) and on scalable algorithm design for mining massive amounts of data. Additional research interests include system, algorithmic, and performance-evaluation aspects of distributed systems. Pietro was appointed head of the Data Science department in May 2016.

Maurizio Filippone (EURECOM)
