

Poster

On the Weight Dynamics of Deep Normalized Networks

Christian H.X. Ali Mehmeti-Göpel · Michael Wand


Abstract:

Recent studies have shown that high disparities in effective learning rates (ELRs) across layers in deep neural networks can negatively affect trainability. We formalize how these disparities evolve over time by modeling the weight dynamics (the evolution of expected gradient and weight norms) of networks with normalization layers, predicting the evolution of layer-wise ELR ratios. We prove that when training with any constant learning rate, ELR ratios converge to 1, despite initial gradient explosion. We identify a 'critical learning rate' beyond which ELR disparities widen, which depends only on the current ELRs. To validate our findings, we devise a hyper-parameter-free warm-up method that successfully minimizes ELR spread quickly in theory and practice. Our experiments link ELR spread with trainability, a relationship that is most evident in very deep networks with significant gradient-magnitude excursions.
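To make the layer-wise quantities in the abstract concrete, the sketch below computes per-layer ELRs and their spread for a small normalized network. It is not the authors' code: it assumes the common definition ELR_l = lr / ||W_l||^2 for scale-invariant (normalization-followed) layers, and the choice of spread metric (max/min ratio, where 1.0 means uniform ELRs) is an illustrative stand-in for whatever measure the paper uses.

```python
# Minimal sketch (not the paper's implementation): layer-wise effective
# learning rates (ELRs) and their spread for a normalized network.
# Assumption: ELR_l = lr / ||W_l||^2, the usual definition for layers
# made scale-invariant by a following normalization layer.
import torch
import torch.nn as nn


def layerwise_elrs(model: nn.Module, lr: float) -> dict[str, float]:
    """Return ELR_l = lr / ||W_l||^2 for each weight matrix / conv kernel."""
    elrs = {}
    for name, param in model.named_parameters():
        if param.dim() >= 2:  # skip biases and normalization affine params
            elrs[name] = lr / (param.detach().norm() ** 2).item()
    return elrs


def elr_spread(elrs: dict[str, float]) -> float:
    """Spread as the ratio of the largest to the smallest layer-wise ELR."""
    values = list(elrs.values())
    return max(values) / min(values)


# Usage: track how the spread evolves during (warm-up) training.
model = nn.Sequential(
    nn.Linear(64, 64), nn.LayerNorm(64), nn.ReLU(),
    nn.Linear(64, 64), nn.LayerNorm(64), nn.ReLU(),
    nn.Linear(64, 10),
)
print(f"initial ELR spread: {elr_spread(layerwise_elrs(model, lr=0.1)):.2f}")
```

A warm-up procedure along the lines described above would monitor this spread during the first training steps and stop (or adjust) once it is close to 1; the exact stopping rule is the paper's contribution and is not reproduced here.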
