Oral
A Framework for Bayesian Optimization in Embedded Subspaces
Amin Nayebi · Alexander Munteanu · Matthias Poloczek

Tue Jun 11th 05:00 -- 05:05 PM @ Room 101

We present a theoretically founded approach for high-dimensional Bayesian optimization based on low-dimensional subspace embeddings. We prove that the error in the Gaussian process model is bounded tightly when going from the original high-dimensional search domain to the low-dimensional embedding. This implies that the optimization process in the low-dimensional embedding proceeds essentially as if it were run directly on the unknown active subspace. The argument applies to a large class of algorithms and GP models, including non-stationary kernels. Moreover, we provide an efficient implementation based on hashing and demonstrate empirically that this subspace embedding achieves considerably better results than the previously proposed methods for high-dimensional BO based on Gaussian matrix projections and structure-learning.
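To make the hashing-based embedding concrete, here is a minimal sketch in Python; the names `make_hash_embedding`, `embed`, `h`, and `s` are illustrative and not taken from the paper. The idea, stated under these assumptions, is that each coordinate of the high-dimensional domain is tied to one low-dimensional coordinate by a random hash and a random sign, so a candidate proposed by the BO loop in the low-dimensional box maps directly into the original search domain.

```python
import numpy as np

def make_hash_embedding(D, d, rng=None):
    """Sketch of a count-sketch-style hashing embedding (names are ours).

    Each high dimension i gets one low dimension h[i] and a random sign
    s[i]; a low-dimensional point y embeds as x[i] = s[i] * y[h[i]].
    """
    rng = np.random.default_rng(rng)
    h = rng.integers(0, d, size=D)        # random low-dim bucket per high dimension
    s = rng.choice([-1.0, 1.0], size=D)   # random sign per high dimension

    def embed(y):
        # y has shape (d,); the result has shape (D,)
        return s * y[h]

    return embed

# Usage: search the low-dimensional box [-1, 1]^d and evaluate the
# expensive objective at the embedded point in [-1, 1]^D.
D, d = 1000, 10
embed = make_hash_embedding(D, d, rng=0)
y = np.random.default_rng(1).uniform(-1.0, 1.0, size=d)
x = embed(y)                              # candidate in the original domain
assert x.shape == (D,) and np.all(np.abs(x) <= 1.0)
```

One appeal of such a construction over dense Gaussian projections is that every coordinate of the embedded point is a signed copy of a low-dimensional coordinate, so points in the low-dimensional box land inside the high-dimensional box without any clipping step.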

Author Information

Amin Nayebi (University of Arizona)
Alexander Munteanu (TU Dortmund)
Matthias Poloczek (Uber AI Labs & The University of Arizona)

**Matthias Poloczek** leads the Bayesian optimization team at Uber AI. He works in **machine learning and optimization**, with a focus on **Bayesian optimization** of expensive functions and its applications in aerospace engineering, biochemistry, and materials science. Before joining Uber AI, Matthias was an assistant professor in the Department of Systems and Industrial Engineering at the **University of Arizona** and a postdoctoral researcher with *David P. Williamson* and *Peter I. Frazier* at **Cornell University**, after he obtained his Ph.D. from the **Goethe-University** Frankfurt in 2013, advised by *Georg Schnitger*.
