Poster
Convexified Convolutional Neural Networks
Yuchen Zhang · Percy Liang · Martin Wainwright

Mon Aug 07 01:30 AM -- 05:00 AM (PDT) @ Gallery #101

We describe the class of convexified convolutional neural networks (CCNNs), which capture the parameter sharing of convolutional neural networks in a convex manner. By representing the nonlinear convolutional filters as vectors in a reproducing kernel Hilbert space, we can express the CNN parameters as a low-rank matrix, and relaxing the rank constraint yields a convex optimization problem. For learning two-layer convolutional neural networks, we prove that the generalization error obtained by a convexified CNN converges to that of the best possible CNN. For learning deeper networks, we train CCNNs in a layer-wise manner. Empirically, CCNNs achieve competitive or better performance than CNNs trained by backpropagation, SVMs, fully-connected neural networks, stacked denoising auto-encoders, and other baseline methods.
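
As a rough illustration of the relaxation described in the abstract, the sketch below approximates the RKHS embedding with random Fourier features, models a two-layer network over image patches, and replaces the low-rank constraint on the parameter matrix with a nuclear-norm penalty minimized by proximal gradient descent. This is a minimal sketch under these assumptions, not the paper's implementation; all names (extract_patches, rff_dim, nuclear_reg) are illustrative.

```python
# Minimal CCNN-style sketch (assumptions: random Fourier features stand in for
# the RKHS embedding; the low-rank constraint is relaxed to a nuclear-norm
# penalty solved by proximal gradient descent on a squared loss).
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(X, patch_size=4, stride=4):
    """Extract non-overlapping patches from square images X of shape (n, d, d)."""
    n, d, _ = X.shape
    patches = []
    for i in range(0, d - patch_size + 1, stride):
        for j in range(0, d - patch_size + 1, stride):
            patches.append(X[:, i:i+patch_size, j:j+patch_size].reshape(n, -1))
    return np.stack(patches, axis=1)  # shape (n, P, patch_size**2)

def rff_features(Z, W, b):
    """Random Fourier features approximating a Gaussian kernel feature map."""
    return np.sqrt(2.0 / W.shape[1]) * np.cos(Z @ W + b)

def prox_nuclear(A, tau):
    """Proximal operator of the nuclear norm: singular value soft-thresholding."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def train_ccnn(X, y, num_classes, rff_dim=200, nuclear_reg=1e-2, lr=1e-2, steps=200):
    patches = extract_patches(X)                      # (n, P, p)
    n, P, p = patches.shape
    W = rng.normal(scale=1.0, size=(p, rff_dim))
    b = rng.uniform(0, 2 * np.pi, size=rff_dim)
    Phi = rff_features(patches.reshape(-1, p), W, b).reshape(n, P, rff_dim)
    Y = np.eye(num_classes)[y]                        # one-hot labels
    # Parameter matrix A maps the kernel features of every patch to class scores.
    # A low-rank A corresponds to shared convolutional filters, so the rank
    # constraint is relaxed to a nuclear-norm penalty on a convex squared loss.
    A = np.zeros((P * rff_dim, num_classes))
    Z = Phi.reshape(n, P * rff_dim)
    for _ in range(steps):
        grad = Z.T @ (Z @ A - Y) / n
        A = prox_nuclear(A - lr * grad, lr * nuclear_reg)
    return A, (W, b)

# Toy usage on random 8x8 "images" with 3 classes.
X = rng.normal(size=(64, 8, 8))
y = rng.integers(0, 3, size=64)
A, kernel_params = train_ccnn(X, y, num_classes=3)
print("parameter matrix:", A.shape, "approx. rank:", np.linalg.matrix_rank(A, tol=1e-3))
```

The singular value soft-thresholding step is what keeps the learned parameter matrix approximately low rank; factoring that matrix back into per-patch filters and output weights would correspond to recovering a (non-convex) CNN from the convex solution, in the spirit of the construction the abstract describes.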

Author Information

Yuchen Zhang (Stanford)
Percy Liang (Stanford University)
Martin Wainwright (University of California at Berkeley)
