Abstract:
We develop exact representations of two-layer neural networks with rectified linear units (ReLUs) as a single convex program whose number of variables is polynomial in the number of training samples and the number of hidden neurons. Our theory utilizes semi-infinite duality and minimum norm regularization. Moreover, we show that certain standard convolutional linear networks are equivalent to $\ell_1$-regularized linear models in a polynomial-sized discrete Fourier feature space.
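To make the claimed representation concrete, the following display sketches the kind of convex program meant here, under the assumption of squared loss and weight-decay regularization; the symbols $X \in \mathbb{R}^{n \times d}$ (data matrix), $y \in \mathbb{R}^n$ (labels), $\beta > 0$ (regularization strength), and the diagonal arrangement matrices $D_i$ are illustrative and are not defined in the abstract itself:
\[
\min_{\{(v_i, w_i)\}_{i=1}^{P}} \; \frac{1}{2} \Bigl\| \sum_{i=1}^{P} D_i X (v_i - w_i) - y \Bigr\|_2^2 + \beta \sum_{i=1}^{P} \bigl( \|v_i\|_2 + \|w_i\|_2 \bigr)
\quad \text{s.t.} \quad (2D_i - I_n) X v_i \ge 0, \;\; (2D_i - I_n) X w_i \ge 0 \;\; \forall i,
\]
where each $D_i = \mathrm{diag}\bigl(\mathbb{1}[X u \ge 0]\bigr)$ for some $u \in \mathbb{R}^d$ enumerates one of the $P$ distinct ReLU activation patterns (hyperplane arrangements) induced on the training data. For data of bounded rank, $P$ grows only polynomially in $n$, which is the sense in which the convex program has polynomially many variables.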