
A Framework for Bayesian Optimization in Embedded Subspaces
Amin Nayebi · Alexander Munteanu · Matthias Poloczek

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #236

We present a theoretically founded approach to high-dimensional Bayesian optimization based on low-dimensional subspace embeddings. We prove that the error in the Gaussian process model is tightly bounded when going from the original high-dimensional search domain to the low-dimensional embedding. This implies that the optimization process in the low-dimensional embedding proceeds essentially as if it were run directly on an unknown active subspace of low dimensionality. The argument applies to a large class of algorithms and GP models, including non-stationary kernels. Moreover, we provide an efficient implementation based on hashing and demonstrate empirically that this subspace embedding achieves considerably better results than previously proposed methods for high-dimensional BO based on Gaussian matrix projections and structure learning.
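To illustrate the kind of hashing-based embedding the abstract describes, here is a minimal sketch of a count-sketch-style map from a low-dimensional search point up to the high-dimensional domain: each high-dimensional coordinate is assigned one low-dimensional coordinate by a random hash and multiplied by a random sign. All names and parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def make_hashing_embedding(high_dim, low_dim, seed=0):
    """Build a count-sketch-style embedding (illustrative sketch only).

    Each of the high_dim coordinates is hashed to one of the low_dim
    coordinates (h) and given a random sign (s), so evaluating the
    objective in high_dim only requires searching over low_dim.
    """
    rng = np.random.default_rng(seed)
    h = rng.integers(0, low_dim, size=high_dim)   # hash bucket per coordinate
    s = rng.choice([-1.0, 1.0], size=high_dim)    # random sign per coordinate

    def embed(y):
        # Map a low-dimensional point y in [-1, 1]^low_dim to the
        # high-dimensional domain; the image stays in [-1, 1]^high_dim.
        return s * y[h]

    return embed

embed = make_hashing_embedding(high_dim=100, low_dim=4)
y = np.array([0.5, -0.25, 1.0, 0.0])   # candidate from the low-dim space
x = embed(y)                            # point evaluated in the high-dim space
```

A BO loop would then fit its GP and pick candidates `y` in the 4-dimensional space while evaluating the expensive objective at `embed(y)` in the 100-dimensional one; the paper's analysis bounds the modeling error introduced by this mapping.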

Author Information

Amin Nayebi (University of Arizona)
Alexander Munteanu (TU Dortmund)
Matthias Poloczek (Uber AI Labs & The University of Arizona)

**Matthias Poloczek** is a Principal Scientist at Amazon. He works in **machine learning and optimization** with applications in robotics, advertising, recommender systems, and more. Moreover, Matthias leads a large-scale initiative on ML operational excellence. Previously, Matthias was a Senior Manager at Uber AI, where he led the AutoML initiative and drove impactful solutions based on Bayesian optimization.
