Poster
On Dropout and Nuclear Norm Regularization
Poorya Mianjy · Raman Arora

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #79
We give a formal and complete characterization of the explicit regularizer induced by dropout in deep linear networks with squared loss. We show that (a) the explicit regularizer is composed of an $\ell_2$-path regularizer and other terms that are also re-scaling invariant, (b) the convex envelope of the induced regularizer is the squared nuclear norm of the network map, and (c) for a sufficiently large dropout rate, we characterize the global optima of the dropout objective. We validate our theoretical findings with empirical results.
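To make claim (a) concrete in the simplest setting, the sketch below checks the known single-hidden-layer case: for a linear network $x \mapsto U\,\mathrm{diag}(b/\theta)\,Vx$ with i.i.d. Bernoulli($\theta$) dropout on the hidden units, the expected squared loss over dropout masks equals the plain squared loss plus the explicit regularizer $\frac{1-\theta}{\theta}\sum_j \|u_j\|^2 (v_j^\top x)^2$, whose terms are invariant to re-scaling $u_j \to c\,u_j$, $v_j \to v_j/c$. This is a minimal numpy illustration with toy dimensions of our own choosing, not code from the paper; it verifies the identity exactly by enumerating all $2^h$ masks.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
d, h, k = 3, 4, 2   # toy input, hidden, and output dimensions (our choice)
theta = 0.5         # dropout retain probability

V = rng.standard_normal((h, d))   # first-layer weights
U = rng.standard_normal((k, h))   # second-layer weights
x = rng.standard_normal(d)
y = rng.standard_normal(k)

# Exact expectation of the squared loss over all 2^h Bernoulli(theta)
# dropout masks b, with the standard 1/theta rescaling of kept units.
expected = 0.0
for bits in itertools.product([0, 1], repeat=h):
    b = np.array(bits, dtype=float)
    prob = theta ** b.sum() * (1 - theta) ** (h - b.sum())
    pred = U @ ((b / theta) * (V @ x))
    expected += prob * np.sum((y - pred) ** 2)

# Closed form: plain squared loss + explicit dropout regularizer.
plain = np.sum((y - U @ V @ x) ** 2)
reg = (1 - theta) / theta * sum(
    np.sum(U[:, j] ** 2) * (V[j] @ x) ** 2 for j in range(h)
)
print(expected, plain + reg)  # the two quantities agree
```

Averaging the per-example regularizer over the data replaces $(v_j^\top x)^2$ with its second moment, which is the form whose convex envelope the paper relates to the squared nuclear norm in claim (b).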

Author Information

Poorya Mianjy (Johns Hopkins University)
Raman Arora (Johns Hopkins University)

Raman Arora received his M.S. and Ph.D. degrees in Electrical and Computer Engineering from the University of Wisconsin-Madison in 2005 and 2009, respectively. From 2009 to 2011, he was a Postdoctoral Research Associate at the University of Washington in Seattle and a Visiting Researcher at Microsoft Research Redmond. Since 2011, he has been with the Toyota Technological Institute at Chicago (TTIC). His research interests include machine learning, speech recognition, and statistical signal processing.
