While both neural architecture search (NAS) and hyperparameter optimization (HPO) have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters, and vice versa. Furthermore, NAS has recently often been framed as a multi-objective optimization problem in order to take additional objectives, such as resource requirements, into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO.
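To illustrate the general setting the abstract describes (not the paper's actual methods), the following sketch samples from a hypothetical joint space of architectural choices and hyperparameters, evaluates each configuration under two objectives, and keeps the non-dominated (Pareto-optimal) set. All names, ranges, and the surrogate objective are illustrative assumptions, not taken from the paper.

```python
import random

# Hypothetical joint NAS + HPO search space: architectural choices and
# hyperparameters are sampled together (ranges are illustrative only).
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],               # architecture
    "width": [64, 128, 256],               # architecture
    "learning_rate": [1e-3, 1e-2, 1e-1],   # hyperparameter
    "weight_decay": [0.0, 1e-4],           # hyperparameter
}

def sample_config(rng):
    """Draw one joint configuration uniformly at random."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(config):
    """Stand-in for training: return (validation error, parameter count).
    Both objectives are to be minimized; this is a toy surrogate."""
    params = config["num_layers"] * config["width"] ** 2
    error = 1.0 / (1.0 + params * config["learning_rate"])
    return (error, params)

def pareto_front(evaluated):
    """Keep configurations whose objective vector is not dominated
    (weakly better in all objectives, strictly in at least one)
    by any other configuration."""
    front = []
    for i, (_, obj_i) in enumerate(evaluated):
        dominated = any(
            all(a <= b for a, b in zip(obj_j, obj_i)) and obj_j != obj_i
            for j, (_, obj_j) in enumerate(evaluated) if j != i
        )
        if not dominated:
            front.append(evaluated[i])
    return front

rng = random.Random(0)
evaluated = [(c, evaluate(c)) for c in (sample_config(rng) for _ in range(20))]
front = pareto_front(evaluated)
```

Random search over the joint space is the simplest multi-objective baseline; the Pareto set it returns exposes the trade-off between, e.g., accuracy and resource cost, which single-objective NAS or HPO alone would collapse into one number.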
Thomas Elsken (Bosch Center for AI)
Difan Deng (Institut für Informationsverarbeitung, Leibniz Universität Hannover)
2021: Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization
Sergio Izquierdo · Julia Guerrero-Viu · Sven Hauns · Guilherme Miotto · Simon Schrodi · André Biedenkapp · Thomas Elsken · Difan Deng · Marius Lindauer · Frank Hutter