

Poster

A Quantitative Analysis of the Effect of Batch Normalization on Gradient Descent

YongQiang Cai · Qianxiao Li · Zuowei Shen

Pacific Ballroom #54

Keywords: [ Algorithms ] [ Convex Optimization ] [ Non-convex Optimization ] [ Optimization ]


Abstract:

Despite its empirical success and recent theoretical progress, a quantitative analysis of the effect of batch normalization (BN) on the convergence and stability of gradient descent is generally lacking. In this paper, we provide such an analysis on the simple problem of ordinary least squares (OLS), where the precise dynamical properties of gradient descent (GD) are completely known, thus allowing us to isolate and compare the additional effects of BN. More precisely, we show that unlike GD, gradient descent with BN (BNGD) converges for arbitrary learning rates for the weights, and the convergence remains linear under mild conditions. Moreover, we quantify two different sources of acceleration of BNGD over GD: one due to over-parameterization, which improves the effective condition number, and another due to having a large range of learning rates that give rise to fast descent. These phenomena set BNGD apart from GD and could account for much of its robustness. These findings are confirmed quantitatively by numerical experiments, which further show that many of the uncovered properties of BNGD in OLS are also observed qualitatively in more complex supervised learning problems.
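To make the abstract's central claim concrete (BNGD on OLS tolerates arbitrary learning rates for the weights, while plain GD diverges beyond its stability threshold), the following is a minimal, hypothetical sketch, not the paper's exact formulation: the synthetic data, the choice to batch-normalize the scalar output X @ w and rescale it by a learnable parameter a, the centering of y, and all learning rates are illustrative assumptions.

    # Sketch: plain GD vs. gradient descent with batch normalization (BNGD) on OLS.
    # Assumed setup, not the paper's exact parameterization.
    import jax
    import jax.numpy as jnp

    key = jax.random.PRNGKey(0)
    k1, k2, k3 = jax.random.split(key, 3)
    n, d = 512, 20
    X = jax.random.normal(k1, (n, d)) * jnp.arange(1, d + 1)   # ill-conditioned features
    w_true = jax.random.normal(k2, (d,))
    y = X @ w_true + 0.1 * jax.random.normal(k3, (n,))

    def ols_loss(w):
        r = X @ w - y
        return 0.5 * jnp.mean(r ** 2)

    def bn_loss(params):
        a, w = params
        z = X @ w
        z_hat = (z - jnp.mean(z)) / jnp.sqrt(jnp.var(z) + 1e-8)  # batch-normalize the output
        r = a * z_hat - (y - jnp.mean(y))                        # learnable rescaling a
        return 0.5 * jnp.mean(r ** 2)

    lr = 10.0  # far above the GD stability threshold 2 / lambda_max(X^T X / n) here

    # Plain GD on OLS: at this learning rate the iterates typically blow up (nan/inf).
    w = jnp.zeros(d)
    for _ in range(200):
        w = w - lr * jax.grad(ols_loss)(w)
    print("GD loss:  ", ols_loss(w))

    # BNGD: same large learning rate on w, modest learning rate on the scale a.
    a, w = 1.0, jnp.ones(d)
    for _ in range(200):
        ga, gw = jax.grad(bn_loss)((a, w))
        a, w = a - 0.1 * ga, w - lr * gw
    print("BNGD loss:", bn_loss((a, w)))

On a typical run of this sketch, the plain-GD loss overflows while the BNGD loss keeps decreasing, which is meant to illustrate, under the stated assumptions, the robustness to the weight learning rate described in the abstract.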
