Poster
On Bridging the Gap between Mean Field and Finite Width Deep Random Multilayer Perceptron with Batch Normalization
Amir Joudaki · Hadi Daneshmand · Francis Bach

Wed Jul 26 02:00 PM -- 03:30 PM (PDT) @ Exhibit Hall 1 #221

Mean-field theory is widely used in theoretical studies of neural networks. In this paper, we analyze the role of depth in the concentration of mean-field predictions for Gram matrices of hidden representations in deep multilayer perceptrons (MLPs) with batch normalization (BN) at initialization. It is postulated that mean-field predictions suffer from layer-wise errors that amplify with depth. We demonstrate that BN avoids this error amplification with depth. When the chain of hidden representations is rapidly mixing, we establish a concentration bound for a mean-field model of Gram matrices. To our knowledge, this is the first concentration bound that does not become vacuous with depth for standard MLPs of finite width.
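The abstract's central object, the Gram matrix of hidden representations at initialization, is straightforward to simulate. The following is a minimal sketch, not the authors' code: a randomly initialized MLP in which each layer applies a tanh nonlinearity followed by batch normalization across the batch, after which the width-normalized Gram matrix of the final hidden layer is printed. The width, depth, batch size, and activation are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    width, depth, batch = 512, 50, 8  # illustrative sizes, not from the paper

    def batch_norm(h):
        # Normalize each neuron (row) to zero mean and unit variance across
        # the batch, i.e., BN at initialization with no learned scale/shift.
        h = h - h.mean(axis=1, keepdims=True)
        return h / (h.std(axis=1, keepdims=True) + 1e-8)

    # Hidden representations: a (width x batch) matrix.
    h = rng.standard_normal((width, batch))
    for _ in range(depth):
        w = rng.standard_normal((width, width)) / np.sqrt(width)  # standard 1/sqrt(width) scaling
        h = batch_norm(np.tanh(w @ h))

    # Gram matrix of hidden representations, normalized by width.
    gram = h.T @ h / width
    off_diag = gram[~np.eye(batch, dtype=bool)]
    print("diag mean:", gram.diagonal().mean().round(3))
    print("off-diag mean:", off_diag.mean().round(3))

Running this shows the Gram matrix remaining non-degenerate as depth grows, consistent with the concentration behavior the paper analyzes; without BN, deep tanh MLPs instead drive the Gram matrix toward a rank-one limit.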

Author Information

Amir Joudaki (Swiss Federal Institute of Technology)
Hadi Daneshmand (MIT)
Francis Bach (INRIA - Ecole Normale Supérieure)
