A major challenge in Bayesian Optimization is the boundary issue, in which an algorithm spends too many evaluations near the boundary of its search space. In this paper, we propose BOCK, Bayesian Optimization with Cylindrical Kernels, whose basic idea is to transform the ball geometry of the search space using a cylindrical transformation. Because of the transformed geometry, the Gaussian Process-based surrogate model spends less budget searching near the boundary, while concentrating its efforts relatively more near the center of the search region, where we expect the solution to be located. We evaluate BOCK extensively, showing that it is not only more accurate and efficient, but also scales successfully to problems with a dimensionality as high as 500. We show that the better accuracy and scalability of BOCK even allow optimizing modestly sized neural network layers, as well as neural network hyperparameters.
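The cylindrical idea from the abstract can be sketched in a few lines: a point in the search ball is split into a radius and a direction, and a product kernel treats the two parts separately. The function names and kernel forms below are illustrative stand-ins, not the paper's exact kernel, which uses a learned radial warping.

```python
import numpy as np

def cylindrical_transform(x, eps=1e-12):
    """Map a point in the unit ball to cylindrical coordinates:
    radius r = ||x|| and direction a = x / ||x|| on the unit sphere."""
    r = np.linalg.norm(x)
    a = x / max(r, eps)
    return r, a

def cylindrical_kernel(x1, x2, lengthscale=0.5):
    """Hypothetical product kernel on the cylinder: an RBF kernel on the
    radii times a simple angular kernel on the directions. Because distance
    is measured in (radius, direction) space, all boundary points share the
    same radius r = 1, which is what reduces the boundary over-exploration."""
    r1, a1 = cylindrical_transform(x1)
    r2, a2 = cylindrical_transform(x2)
    k_radial = np.exp(-0.5 * (r1 - r2) ** 2 / lengthscale ** 2)
    k_angular = 0.5 * (1.0 + a1 @ a2)  # in [0, 1], maximal when directions align
    return k_radial * k_angular
```

Under this factorization, two points on the boundary with similar directions look very similar to the surrogate model, so the acquisition function gains little from sampling many of them.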
Author Information
ChangYong Oh (University of Amsterdam)
Efstratios Gavves (University of Amsterdam)
Max Welling (University of Amsterdam)
Prof. Dr. Max Welling is a research chair in Machine Learning at the University of Amsterdam and a VP Technologies at Qualcomm. He has a secondary appointment as a senior fellow at the Canadian Institute for Advanced Research (CIFAR). He is co-founder of "Scyfer BV", a university spin-off in deep learning, which was acquired by Qualcomm in summer 2017. In the past he held postdoctoral positions at Caltech ('98-'00), UCL ('00-'01) and the University of Toronto ('01-'03). He received his PhD in '98 under the supervision of Nobel laureate Prof. G. 't Hooft.

Max Welling served as associate editor-in-chief of IEEE TPAMI from 2011-2015 (impact factor 4.8). He has served on the board of the NIPS Foundation since 2015 (the largest conference in machine learning) and was program chair and general chair of NIPS in 2013 and 2014, respectively. He was also program chair of AISTATS in 2009 and ECCV in 2016, and general chair of MIDL 2018. He has served on the editorial boards of JMLR and JML and was an associate editor for Neurocomputing, JCGS and TPAMI.

He has received multiple grants from Google, Facebook, Yahoo, NSF, NIH, NWO and ONR-MURI, among which an NSF CAREER grant in 2005, and is the recipient of the ECCV Koenderink Prize in 2010. Welling is on the board of the Data Science Research Center in Amsterdam, directs the Amsterdam Machine Learning Lab (AMLAB), and co-directs the Qualcomm-UvA deep learning lab (QUVA) and the Bosch-UvA Deep Learning lab (DELTA). He has over 200 scientific publications in machine learning, computer vision, statistics and physics.
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: BOCK: Bayesian Optimization with Cylindrical Kernels
  Wed. Jul 11th 04:15 -- 07:00 PM, Room Hall B #156
More from the Same Authors
- 2023 Workshop: Structured Probabilistic Inference and Generative Modeling
  Dinghuai Zhang · Yuanqi Du · Chenlin Meng · Shawn Tan · Yingzhen Li · Max Welling · Yoshua Bengio
- 2022 Poster: Lie Point Symmetry Data Augmentation for Neural PDE Solvers
  Johannes Brandstetter · Max Welling · Daniel Worrall
- 2022 Spotlight: Lie Point Symmetry Data Augmentation for Neural PDE Solvers
  Johannes Brandstetter · Max Welling · Daniel Worrall
- 2021 Test of Time: Test of Time Award
  Max Welling
- 2021 Poster: A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups
  Marc Finzi · Max Welling · Andrew Wilson
- 2021 Oral: A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups
  Marc Finzi · Max Welling · Andrew Wilson
- 2020 Poster: Low Bias Low Variance Gradient Estimates for Hierarchical Boolean Stochastic Networks
  Adeel Pervez · Taco Cohen · Efstratios Gavves
- 2019 Workshop: Learning and Reasoning with Graph-Structured Representations
  Ethan Fetaya · Zhiting Hu · Thomas Kipf · Yujia Li · Xiaodan Liang · Renjie Liao · Raquel Urtasun · Hao Wang · Max Welling · Eric Xing · Richard Zemel
- 2019 Poster: Stochastic Beams and Where To Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement
  Wouter Kool · Herke van Hoof · Max Welling
- 2019 Oral: Stochastic Beams and Where To Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement
  Wouter Kool · Herke van Hoof · Max Welling
- 2018 Invited Talk: Intelligence per Kilowatthour
  Max Welling
- 2017 Poster: Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
  Christos Louizos · Max Welling
- 2017 Talk: Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
  Christos Louizos · Max Welling