Poster
Dropout: Explicit Forms and Capacity Control
Raman Arora · Peter Bartlett · Poorya Mianjy · Nati Srebro

Wed Jul 21 09:00 PM -- 11:00 PM (PDT) @ Virtual

We investigate the capacity control provided by dropout in various machine learning problems. First, we study dropout for matrix completion, where it induces a distribution-dependent regularizer that equals the weighted trace-norm of the product of the factors. In deep learning, we show that the distribution-dependent regularizer due to dropout directly controls the Rademacher complexity of the underlying class of deep neural networks. These developments enable us to give concrete generalization error bounds for the dropout algorithm, both in matrix completion and in training deep neural networks.
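To make the matrix-completion setting concrete, here is a minimal NumPy sketch (not the authors' code) of dropout applied to a factorization M ≈ U Vᵀ: a Bernoulli mask on the shared factor dimension makes the expected squared loss equal the plain loss plus a data-dependent penalty on the factor columns, the kind of quantity the abstract relates to a weighted trace-norm. All dimensions, hyperparameters, and variable names (n, m, r, p, lr) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, r = 50, 40, 10   # matrix dimensions and factorization rank (illustrative)
p = 0.5                # dropout retain probability
lr = 0.01              # step size

M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))  # ground-truth low-rank matrix
mask_obs = rng.random((n, m)) < 0.3                            # observed entries

U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((m, r))

for step in range(2000):
    # Bernoulli dropout on the shared dimension, rescaled by 1/p so the
    # reconstruction U diag(b) V^T is unbiased in expectation.
    b = (rng.random(r) < p) / p
    pred = (U * b) @ V.T
    err = (pred - M) * mask_obs        # loss measured only on observed entries
    # Gradients of 0.5 * ||mask_obs * (U diag(b) V^T - M)||_F^2
    gU = (err @ V) * b
    gV = (err.T @ U) * b
    U -= lr * gU
    V -= lr * gV

# In expectation over the dropout mask, this objective adds a penalty on the
# column norms of U and V that depends on the sampling distribution of the
# observed entries -- the regularizer the paper studies.
unobs = ~mask_obs
rmse = np.sqrt(((U @ V.T - M)[unobs] ** 2).mean())
print(f"held-out RMSE: {rmse:.3f}")
```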

Author Information

Raman Arora (Johns Hopkins University)

Raman Arora received his M.S. and Ph.D. degrees in Electrical and Computer Engineering from the University of Wisconsin-Madison in 2005 and 2009, respectively. From 2009 to 2011, he was a Postdoctoral Research Associate at the University of Washington in Seattle and a Visiting Researcher at Microsoft Research Redmond. Since 2011, he has been with Toyota Technological Institute at Chicago (TTIC). His research interests include machine learning, speech recognition, and statistical signal processing.

Peter Bartlett ("University of California, Berkeley")
Poorya Mianjy (Johns Hopkins University)
Nati Srebro (Toyota Technological Institute at Chicago)
