Abstract
Traditional methods for the discovery of latent network structures are limited in two ways: they either assume that all the signal comes from the network (i.e., there is no source of signal outside the network), or they place constraints on the network parameters to ensure model or algorithmic stability. We address these limitations by proposing a model that incorporates a Gaussian process prior on a network-independent component, and by formally proving that algorithmic stability then comes for free. We also provide a novel perspective on model stability, together with robustness results and precise intervals for key inference parameters. We show that our approach consistently outperforms previous methods on three applications.
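The abstract describes a model in which each observed signal decomposes into a network-driven part and a network-independent part with a Gaussian process prior. The sketch below is a minimal toy illustration of that kind of generative structure, not the authors' actual model (which is fitted by variational inference with a Concrete relaxation over network edges, per the title); the node count, RBF kernel, sparsity level, additive mixing form, and noise scale are all illustrative assumptions.

```python
# Toy sketch (illustrative assumptions only, not the paper's model):
# observations = network-driven signal + GP-distributed network-independent signal + noise.
import numpy as np

rng = np.random.default_rng(0)

N, T = 5, 100                              # number of nodes, number of time points (assumed)
t = np.linspace(0.0, 10.0, T)

# Hypothetical sparse weighted adjacency matrix for the latent network.
A = (rng.random((N, N)) < 0.3) * rng.normal(0.0, 1.0, (N, N))
np.fill_diagonal(A, 0.0)

def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    # Standard RBF (squared-exponential) kernel; the kernel choice is an assumption.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Network-independent component g_i(t) ~ GP(0, k), one draw per node.
K = rbf_kernel(t) + 1e-6 * np.eye(T)
g = rng.multivariate_normal(np.zeros(T), K, size=N)   # shape (N, T)

# Network-driven component: each node linearly mixes the other nodes' signals through A.
network_signal = A @ g                                 # shape (N, T)

# Observed signals: network part plus network-independent part plus observation noise.
y = network_signal + g + 0.1 * rng.normal(size=(N, T))
print(y.shape)                                         # (5, 100)
```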
Author Information
Amir Dezfouli (UNSW)
Edwin Bonilla (UNSW)
Richard Nock (Data61, The Australian National University and the University of Sydney)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: Variational Network Inference: Strong and Stable with Concrete Support
  Wed. Jul 11th, 04:15 -- 07:00 PM, Room: Hall B
More from the Same Authors
- 2022 Poster: Neural Network Poisson Models for Behavioural and Neural Spike Train Data
  Moein Khajehnejad · Forough Habibollahi · Richard Nock · Ehsan Arabzadeh · Peter Dayan · Amir Dezfouli
- 2022 Spotlight: Neural Network Poisson Models for Behavioural and Neural Spike Train Data
  Moein Khajehnejad · Forough Habibollahi · Richard Nock · Ehsan Arabzadeh · Peter Dayan · Amir Dezfouli
- 2020 Poster: Supervised learning: no loss no cry
  Richard Nock · Aditya Menon
- 2019 Poster: Monge blunts Bayes: Hardness Results for Adversarial Training
  Zac Cranko · Aditya Menon · Richard Nock · Cheng Soon Ong · Zhan Shi · Christian Walder
- 2019 Poster: Lossless or Quantized Boosting with Integer Arithmetic
  Richard Nock · Robert C Williamson
- 2019 Oral: Lossless or Quantized Boosting with Integer Arithmetic
  Richard Nock · Robert C Williamson
- 2019 Oral: Monge blunts Bayes: Hardness Results for Adversarial Training
  Zac Cranko · Aditya Menon · Richard Nock · Cheng Soon Ong · Zhan Shi · Christian Walder
- 2019 Poster: Boosted Density Estimation Remastered
  Zac Cranko · Richard Nock
- 2019 Oral: Boosted Density Estimation Remastered
  Zac Cranko · Richard Nock
- 2017 Workshop: Human in the Loop Machine Learning
  Richard Nock · Cheng Soon Ong
- 2017 Poster: Random Feature Expansions for Deep Gaussian Processes
  Kurt Cutajar · Edwin Bonilla · Pietro Michiardi · Maurizio Filippone
- 2017 Talk: Random Feature Expansions for Deep Gaussian Processes
  Kurt Cutajar · Edwin Bonilla · Pietro Michiardi · Maurizio Filippone