Optimizing Hyperparameters with Conformal Quantile Regression
David Salinas · Jacek Golebiowski · Aaron Klein · Matthias Seeger · Cedric Archambeau

Tue Jul 25 02:00 PM -- 04:30 PM (PDT) @ Exhibit Hall 1 #112

Many state-of-the-art hyperparameter optimization (HPO) algorithms rely on model-based optimizers that learn surrogate models of the target function to guide the search. Gaussian processes are the de facto surrogate model due to their ability to capture uncertainty. However, they make strong assumptions about the observation noise, which might not be warranted in practice. In this work, we propose to leverage conformalized quantile regression, which makes minimal assumptions about the observation noise and, as a result, models the target function in a more realistic and robust fashion, which translates to quicker HPO convergence on empirical benchmarks. To apply our method in a multi-fidelity setting, we propose a simple yet effective technique that aggregates observed results across different resource levels and outperforms conventional methods across many empirical tasks.
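The core building block, split conformalized quantile regression (CQR), can be sketched as follows. This is a generic illustration of the technique, not the paper's implementation: fit lower and upper quantile regressors on a training split, compute conformity scores on a held-out calibration split, and widen the predicted interval by the empirical quantile of those scores so coverage holds without any Gaussian noise assumption. The toy objective and all parameter choices below are hypothetical.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy HPO-like objective: one hyperparameter, heteroscedastic
# (non-Gaussian-friendly) observation noise.
X = rng.uniform(0.0, 1.0, size=(500, 1))
y = np.sin(4.0 * X[:, 0]) + rng.normal(scale=0.2 + 0.3 * X[:, 0])

# Split into a proper training set and a calibration set.
X_tr, y_tr = X[:300], y[:300]
X_cal, y_cal = X[300:], y[300:]

alpha = 0.1  # target miscoverage: 90% prediction intervals

# Fit lower and upper quantile regressors (pinball loss).
lo = GradientBoostingRegressor(loss="quantile", alpha=alpha / 2).fit(X_tr, y_tr)
hi = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha / 2).fit(X_tr, y_tr)

# Conformity scores on the calibration set: how far each observation
# falls outside the raw quantile band.
scores = np.maximum(lo.predict(X_cal) - y_cal, y_cal - hi.predict(X_cal))
n = len(scores)
q_hat = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Conformalized interval for a new hyperparameter configuration:
# the raw band widened by q_hat, giving distribution-free coverage.
x_new = np.array([[0.5]])
lower = lo.predict(x_new) - q_hat
upper = hi.predict(x_new) + q_hat
```

In a model-based HPO loop, such calibrated intervals would replace the Gaussian-process posterior as the surrogate's uncertainty estimate when scoring candidate configurations.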

Author Information

David Salinas (Amazon AWS)
Jacek Golebiowski (Amazon Web Services)
Aaron Klein (AWS Berlin)
Matthias Seeger (Amazon Research)

Matthias W. Seeger received a Ph.D. from the School of Informatics, University of Edinburgh, UK, in 2003 (advisor Christopher Williams). He was a research fellow with Michael Jordan and Peter Bartlett at the University of California, Berkeley, from 2003, and with Bernhard Schoelkopf at the Max Planck Institute for Intelligent Systems, Tuebingen, Germany, from 2005. He led a research group at Saarland University, Saarbruecken, Germany, from 2008, and was an assistant professor at the Ecole Polytechnique Federale de Lausanne from fall 2010. He joined Amazon as a machine learning scientist in 2014.

Cedric Archambeau (Amazon Web Services)