

A Framework for Bayesian Optimization in Embedded Subspaces

Amin Nayebi · Alexander Munteanu · Matthias Poloczek

Pacific Ballroom #236

Keywords: [ Optimization - Others ] [ Non-convex Optimization ] [ Gaussian Processes ] [ Dimensionality Reduction ] [ Bayesian Methods ]


We present a theoretically founded approach for high-dimensional Bayesian optimization (BO) based on low-dimensional subspace embeddings. We prove that the error in the Gaussian process (GP) model is tightly bounded when going from the original high-dimensional search domain to the low-dimensional embedding. This implies that the optimization process in the low-dimensional embedding proceeds essentially as if it were run directly on an unknown active subspace of low dimensionality. The argument applies to a large class of algorithms and GP models, including non-stationary kernels. Moreover, we provide an efficient implementation based on hashing and demonstrate empirically that this subspace embedding achieves considerably better results than previously proposed methods for high-dimensional BO based on Gaussian matrix projections and structure learning.
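The hashing-based embedding mentioned in the abstract can be illustrated with a count-sketch-style construction: each high-dimensional coordinate is tied to one low-dimensional coordinate via a random hash and a random sign. The sketch below is an illustrative assumption about this construction (the function names and the uniform low-dimensional search box are ours, not from the paper), not the authors' implementation:

```python
import numpy as np

def hashing_embedding(D, d, rng):
    """Build a count-sketch-style lift from R^d (low) to R^D (high).

    Each high dimension i is tied to one low dimension h(i) with a
    random sign s(i), so x_i = s(i) * y_{h(i)}. BO then searches over
    the low-dimensional y while the objective is evaluated at x.
    """
    h = rng.integers(0, d, size=D)        # hash: low-dim coordinate feeding dim i
    s = rng.choice([-1.0, 1.0], size=D)   # random signs

    def lift(y):
        y = np.asarray(y, dtype=float)
        return s * y[h]                   # x_i = s(i) * y_{h(i)}

    return lift

rng = np.random.default_rng(0)
lift = hashing_embedding(D=100, d=5, rng=rng)
y = rng.uniform(-1.0, 1.0, size=5)        # candidate in the low-dim box [-1, 1]^5
x = lift(y)                               # corresponding high-dim query point
```

Because the lift is a sparse sign matrix, evaluating it costs O(D) rather than the O(Dd) of a dense Gaussian projection, which is what makes the hashing variant efficient.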
