Poster in Workshop: High-dimensional Learning Dynamics Workshop: The Emergence of Structure and Reasoning
Nonconvex Meta-optimization for Deep Learning
Xinyi Chen · Evan Dogariu · Zhou Lu · Elad Hazan
Abstract:
Hyperparameter tuning in mathematical optimization is a notoriously difficult problem. Recent tools from online control give rise to a provable methodology for hyperparameter tuning in convex optimization, called meta-optimization. In this work, we extend this methodology to nonconvex optimization and the training of deep neural networks. We present an algorithm for nonconvex meta-optimization that leverages the reduction from nonconvex to convex optimization, and investigate its applicability to deep learning tasks on academic-scale datasets.
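To make the framing concrete, here is a minimal, hypothetical sketch of hyperparameter tuning cast as an outer online-learning problem: an outer learner adjusts the step size of an inner gradient-descent run across episodes. All names (`inner_loss`, `meta_optimize`) and the finite-difference update are illustrative assumptions; this is not the paper's algorithm and omits the online-control and nonconvex-to-convex reduction machinery entirely.

```python
# Hypothetical illustration: hyperparameter tuning as an outer
# optimization loop over repeated inner training episodes.
# NOT the authors' method -- a toy stand-in on a 1-D quadratic.

def inner_loss(eta, steps=20):
    """Run `steps` iterations of gradient descent with step size
    eta on f(x) = x^2, starting at x = 1, and return the final loss."""
    x = 1.0
    for _ in range(steps):
        x -= eta * 2.0 * x  # gradient of x^2 is 2x
    return x * x

def meta_optimize(episodes=50, eta0=0.05, meta_lr=0.1, eps=1e-3):
    """Tune eta across episodes using a central finite-difference
    estimate of the episode loss's gradient in eta (an assumed,
    simple outer update rule)."""
    eta = eta0
    for _ in range(episodes):
        g = (inner_loss(eta + eps) - inner_loss(eta - eps)) / (2 * eps)
        eta = max(1e-4, min(0.9, eta - meta_lr * g))  # keep eta in a stable range
    return eta

if __name__ == "__main__":
    eta = meta_optimize()
    print("tuned eta:", eta, "final loss:", inner_loss(eta))
```

The point of the sketch is only the structure: each episode re-runs the inner optimizer from scratch, and the outer learner treats the resulting training loss as its own objective in the hyperparameter.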