Poster
A Quantitative Analysis of the Effect of Batch Normalization on Gradient Descent
YongQiang Cai · Qianxiao Li · Zuowei Shen

Tue Jun 11th 06:30 -- 09:00 PM @ Pacific Ballroom #54

Despite its empirical success and recent theoretical progress, a quantitative analysis of the effect of batch normalization (BN) on the convergence and stability of gradient descent is generally lacking. In this paper, we provide such an analysis on the simple problem of ordinary least squares (OLS), where the precise dynamical properties of gradient descent (GD) are completely known, thus allowing us to isolate and compare the additional effects of BN. More precisely, we show that unlike GD, gradient descent with BN (BNGD) converges for arbitrary learning rates for the weights, and the convergence remains linear under mild conditions. Moreover, we quantify two distinct sources of acceleration of BNGD over GD: one due to over-parameterization, which improves the effective condition number, and another due to the large range of learning rates that give rise to fast descent. These phenomena set BNGD apart from GD and could account for much of its robustness properties. These findings are confirmed quantitatively by numerical experiments, which further show that many of the uncovered properties of BNGD in OLS are also observed qualitatively in more complex supervised learning problems.
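The central claim — that BNGD tolerates arbitrary weight learning rates while plain GD diverges above 2/λ_max — can be sketched numerically. The snippet below is a minimal illustration (not code from the paper): it runs population-level GD and BNGD on a tiny two-dimensional OLS problem. The covariance, ground-truth weights, initialization, and learning rates are all illustrative assumptions, and the BN objective is written in the population form suggested by the abstract's setup.

```python
import numpy as np

# Illustrative 2-D OLS problem (all values are assumptions, not from the paper).
H = np.diag([1.0, 100.0])   # input covariance E[x x^T]; condition number 100
u = np.array([1.0, 1.0])    # ground-truth weights in the noiseless model y = u . x
g = H @ u                   # cross-moment E[x y] = H u
c = float(u @ H @ u)        # second moment E[y^2]

def gd(lr, steps=10_000):
    """Plain GD on the OLS loss 0.5 (w-u)^T H (w-u); stable only for lr < 2/lambda_max."""
    w = np.zeros(2)
    with np.errstate(over="ignore", invalid="ignore"):
        for _ in range(steps):
            w = w - lr * (H @ (w - u))
        return 0.5 * (w - u) @ H @ (w - u)

def bngd(lr_w, lr_a=0.5, steps=100_000):
    """GD on the batch-normalized objective
    J(a, w) = 0.5 E[(y - a * (w . x) / sigma_w)^2],  sigma_w = sqrt(w^T H w),
    which simplifies to 0.5 (c - 2 a z + a^2) with z = (g . w) / sigma_w."""
    w, a = np.array([1.0, 0.3]), 0.0
    for _ in range(steps):
        s = np.sqrt(w @ H @ w)                        # sigma_w
        z = g @ w / s
        grad_a = a - z                                # dJ/da
        grad_w = -a * (g / s - z * (H @ w) / s**2)    # dJ/dw = -a dz/dw
        a -= lr_a * grad_a
        w = w - lr_w * grad_w
    s = np.sqrt(w @ H @ w)
    return 0.5 * (c - 2 * a * (g @ w) / s + a**2)

print(gd(0.021))   # just above 2/lambda_max = 0.02: the loss blows up
print(bngd(1.0))   # 50x that threshold, yet BNGD still drives the loss down
```

Note that the BN gradient in `w` is orthogonal to `w` itself, so ‖w‖ only grows and the effective learning rate lr_w/‖w‖² shrinks until the iteration self-stabilizes — a toy view of the "large range of learning rates" mechanism described in the abstract.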

Author Information

YongQiang Cai (National University of Singapore)
Qianxiao Li (National University of Singapore; IHPC, Singapore)
Zuowei Shen (National University of Singapore)
