Poster
Neural Networks are Convex Regularizers: Exact Polynomial-time Convex Optimization Formulations for Two-layer Networks
Mert Pilanci · Tolga Ergen
Keywords: [ Computational Learning Theory ] [ Convex Optimization ] [ Non-convex Optimization ] [ Sparsity and Compressed Sensing ] [ Optimization - Convex ]
Abstract:
We develop exact representations of two-layer neural networks with rectified linear units (ReLUs) in terms of a single convex program whose number of variables is polynomial in the number of training samples and the number of hidden neurons. Our theory utilizes semi-infinite duality and minimum-norm regularization. Moreover, we show that certain standard convolutional linear networks are equivalent to $\ell_1$-regularized linear models in a polynomial-sized discrete Fourier feature space.
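To make the convex reformulation concrete, below is a minimal sketch (not the authors' code) of the idea for a two-layer ReLU network with squared loss: candidate ReLU sign patterns are enumerated, and a group-$\ell_1$-regularized convex program with sign-pattern consistency constraints is solved. The data `X`, targets `y`, regularization strength `beta`, and the randomized subsampling of sign patterns are illustrative assumptions; the theory enumerates all hyperplane-arrangement patterns, whose count is polynomial in the number of samples for fixed data rank.

```python
# Sketch of a convex program for a two-layer ReLU network (assumptions noted in comments).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, d = 20, 3                      # training samples, input dimension (illustrative)
X = rng.standard_normal((n, d))   # data matrix (assumed)
y = rng.standard_normal(n)        # targets (assumed)
beta = 0.1                        # regularization strength (assumed)

# Sample candidate ReLU sign patterns D_i = diag(1[X u >= 0]).
# Exhaustive enumeration is polynomial in n for fixed d; we subsample for brevity.
U = rng.standard_normal((d, 50))
patterns = np.unique((X @ U >= 0).astype(int).T, axis=0)
P = len(patterns)

V = cp.Variable((P, d))           # "positive" neuron blocks
W = cp.Variable((P, d))           # "negative" neuron blocks
pred = 0
constraints = []
for i, diag in enumerate(patterns):
    D = np.diag(diag)
    DX = D @ X
    pred = pred + DX @ (V[i] - W[i])
    # Each block must be consistent with its assumed ReLU sign pattern.
    constraints += [(2 * D - np.eye(n)) @ X @ V[i] >= 0,
                    (2 * D - np.eye(n)) @ X @ W[i] >= 0]

# Group-l1 (sum of row norms) penalty, playing the role of weight decay in the
# original non-convex formulation.
group_l1 = cp.sum(cp.norm(V, 2, axis=1) + cp.norm(W, 2, axis=1))
objective = cp.Minimize(0.5 * cp.sum_squares(pred - y) + beta * group_l1)
cp.Problem(objective, constraints).solve()
print("optimal convex objective:", objective.value)
```

With all sign patterns enumerated, this convex problem is globally solvable in polynomial time for fixed data rank; the subsampled version above is only meant to illustrate the structure of the variables, constraints, and group-$\ell_1$ penalty.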