The composition of multiple Gaussian processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work, we introduce a novel formulation of DGPs based on random feature expansions that we train using stochastic variational inference. This yields a practical learning framework which significantly advances the state of the art in inference for DGPs and enables accurate quantification of uncertainty. We extensively showcase the scalability and performance of our proposal on several datasets with up to 8 million observations, and various DGP architectures with up to 30 hidden layers.
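To make the idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of how each GP layer of a DGP can be approximated with random Fourier features for the RBF kernel, so that a layer reduces to a finite linear model with random weights. The function names, hyperparameter values, and the use of prior samples for the weights are assumptions made purely for this example; the paper instead learns a variational posterior over these weights via stochastic variational inference.

```python
# Illustrative sketch: random-feature approximation of a DGP prior.
# NOT the authors' code; names, shapes, and hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def random_feature_layer(X, n_features=100, lengthscale=1.0, variance=1.0):
    """Map inputs X (N x D) to random Fourier features Phi (N x 2*n_features)
    approximating an RBF kernel with the given lengthscale and variance."""
    N, D = X.shape
    # Spectral frequencies of the RBF kernel: Omega ~ N(0, 1/lengthscale^2)
    Omega = rng.normal(scale=1.0 / lengthscale, size=(D, n_features))
    proj = X @ Omega
    Phi = np.sqrt(variance / n_features) * np.hstack([np.cos(proj), np.sin(proj)])
    return Phi

def dgp_forward(X, layer_widths):
    """Propagate inputs through a stack of random-feature GP layers,
    sampling the linear weights of each layer from a standard normal prior.
    (The paper optimizes a variational posterior over these weights instead.)"""
    F = X
    for width in layer_widths:
        Phi = random_feature_layer(F)
        W = rng.normal(size=(Phi.shape[1], width))  # prior sample of weights
        F = Phi @ W
    return F

# Example: one prior sample from a 3-layer DGP on toy inputs.
X = rng.normal(size=(5, 2))
print(dgp_forward(X, layer_widths=[3, 3, 1]).shape)  # (5, 1)
```

Because each layer is now a linear model in the random features, the intractable GP compositions become compositions of Bayesian linear models, which is what makes stochastic variational training over the weights practical at the scales reported in the abstract.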
Author Information
Kurt Cutajar (EURECOM)
Edwin Bonilla (UNSW)
Pietro Michiardi (EURECOM)
Pietro Michiardi received his M.S. in Computer Science from EURECOM and his M.S. in Electrical Engineering from Politecnico di Torino. He received his Ph.D. in Computer Science from Telecom ParisTech (formerly ENST, Paris) and his HDR (Habilitation) from UNSA. Today, Pietro is a Professor of Computer Science at EURECOM, where he leads the Distributed Systems Group, which blends theory and systems research focusing on large-scale distributed systems (including data processing and data storage) and on the design of scalable algorithms to mine massive amounts of data. Additional research interests include the system, algorithmic, and performance-evaluation aspects of distributed systems. Pietro was appointed head of the Data Science department in May 2016.
Maurizio Filippone (EURECOM)
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Talk: Random Feature Expansions for Deep Gaussian Processes
  Wed. Aug 9th 06:06 -- 06:24 AM, Room C4.9 & C4.10
More from the Same Authors
- 2022: A New Look on Diffusion Times for Score-based Generative Models
  Giulio Franzese · Simone Rossi · Lixuan Yang · Alessandro Finamore · Dario Rossi · Maurizio Filippone · Pietro Michiardi
- 2022 Poster: Revisiting the Effects of Stochasticity for Hamiltonian Samplers
  Giulio Franzese · Dimitrios Milios · Maurizio Filippone · Pietro Michiardi
- 2022 Spotlight: Revisiting the Effects of Stochasticity for Hamiltonian Samplers
  Giulio Franzese · Dimitrios Milios · Maurizio Filippone · Pietro Michiardi
- 2021 Poster: Sparse within Sparse Gaussian Processes using Neighbor Information
  Gia-Lac Tran · Dimitrios Milios · Pietro Michiardi · Maurizio Filippone
- 2021 Spotlight: Sparse within Sparse Gaussian Processes using Neighbor Information
  Gia-Lac Tran · Dimitrios Milios · Pietro Michiardi · Maurizio Filippone
- 2021 Poster: An Identifiable Double VAE For Disentangled Representations
  Graziano Mita · Maurizio Filippone · Pietro Michiardi
- 2021 Spotlight: An Identifiable Double VAE For Disentangled Representations
  Graziano Mita · Maurizio Filippone · Pietro Michiardi
- 2019 Poster: Good Initializations of Variational Bayes for Deep Models
  Simone Rossi · Pietro Michiardi · Maurizio Filippone
- 2019 Oral: Good Initializations of Variational Bayes for Deep Models
  Simone Rossi · Pietro Michiardi · Maurizio Filippone
- 2018 Poster: Constraining the Dynamics of Deep Probabilistic Models
  Marco Lorenzi · Maurizio Filippone
- 2018 Oral: Constraining the Dynamics of Deep Probabilistic Models
  Marco Lorenzi · Maurizio Filippone
- 2018 Poster: Variational Network Inference: Strong and Stable with Concrete Support
  Amir Dezfouli · Edwin Bonilla · Richard Nock
- 2018 Oral: Variational Network Inference: Strong and Stable with Concrete Support
  Amir Dezfouli · Edwin Bonilla · Richard Nock