We present a theoretically founded approach for high-dimensional Bayesian optimization based on low-dimensional subspace embeddings. We prove that the error in the Gaussian process model is bounded tightly when going from the original high-dimensional search domain to the low-dimensional embedding. This implies that the optimization process in the low-dimensional embedding proceeds essentially as if it were run directly on the unknown active subspace. The argument applies to a large class of algorithms and GP models, including non-stationary kernels. Moreover, we provide an efficient implementation based on hashing and demonstrate empirically that this subspace embedding achieves considerably better results than the previously proposed methods for high-dimensional BO based on Gaussian matrix projections and structure-learning.
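The hashing-based embedding mentioned in the abstract can be sketched in a few lines. The following is a minimal, illustrative sketch (not the authors' released code), assuming a count-sketch-style map in which each high-dimensional coordinate is tied to one randomly chosen low-dimensional coordinate with a random sign; the function names and dimensions are hypothetical.

```python
# Hedged sketch of a hashing-based subspace embedding for high-dimensional BO.
# Assumption: each of the D high-dimensional coordinates is fed by one
# randomly hashed low-dimensional coordinate with a random sign flip.
import numpy as np

def make_hashing_embedding(D, d, seed=None):
    """Return a map from the low-dimensional box [-1, 1]^d to [-1, 1]^D.

    Evaluating the embedding costs O(D) and keeps points inside the box,
    since each output coordinate is a signed copy of one input coordinate.
    """
    rng = np.random.default_rng(seed)
    h = rng.integers(0, d, size=D)        # hash: low-dim coordinate feeding dimension i
    s = rng.choice([-1.0, 1.0], size=D)   # random sign per high dimension

    def embed(y):
        y = np.asarray(y, dtype=float)    # point proposed in the low-dimensional domain
        return s * y[h]                    # corresponding high-dimensional candidate
    return embed

# Usage: a BO loop would search over y in [-1, 1]^d and evaluate the expensive
# objective only at the embedded high-dimensional points embed(y).
embed = make_hashing_embedding(D=1000, d=10, seed=0)
y = np.random.default_rng(1).uniform(-1.0, 1.0, size=10)
x = embed(y)                               # candidate passed to the black-box objective
```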
Author Information
Amin Nayebi (University of Arizona)
Alexander Munteanu (TU Dortmund)
Matthias Poloczek (Uber AI Labs & The University of Arizona)
**Matthias Poloczek** leads the Bayesian optimization team at Uber AI. He works in **machine learning and optimization**, with a focus on **Bayesian optimization** of expensive functions and its applications in aerospace engineering, biochemistry, and materials science. Before joining Uber AI, Matthias was an assistant professor in the Department of Systems and Industrial Engineering at the **University of Arizona** and a postdoctoral researcher with *David P. Williamson* and *Peter I. Frazier* at **Cornell University**, after he obtained his Ph.D. from the **Goethe-University** Frankfurt in 2013, advised by *Georg Schnitger*.
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: A Framework for Bayesian Optimization in Embedded Subspaces
  Wed. Jun 12th 01:30 -- 04:00 AM, Pacific Ballroom #236
More from the Same Authors
- 2022 Poster: Bounding the Width of Neural Networks via Coupled Initialization - A Worst Case Analysis
  Alexander Munteanu · Simon Omlor · Zhao Song · David Woodruff
- 2022 Spotlight: Bounding the Width of Neural Networks via Coupled Initialization - A Worst Case Analysis
  Alexander Munteanu · Simon Omlor · Zhao Song · David Woodruff
- 2021 Poster: Oblivious Sketching for Logistic Regression
  Alexander Munteanu · Simon Omlor · David Woodruff
- 2021 Spotlight: Oblivious Sketching for Logistic Regression
  Alexander Munteanu · Simon Omlor · David Woodruff
- 2018 Poster: Bayesian Optimization of Combinatorial Structures
  Ricardo Baptista · Matthias Poloczek
- 2018 Oral: Bayesian Optimization of Combinatorial Structures
  Ricardo Baptista · Matthias Poloczek