Multiplicative Noise and Heavy Tails in Stochastic Optimization
Liam Hodgkinson · Michael Mahoney

Tue Jul 20 07:40 AM -- 07:45 AM (PDT)

Although stochastic optimization is central to modern machine learning, the precise mechanisms underlying its success, and in particular the role of the stochasticity, remain unclear. Modeling stochastic optimization algorithms as discrete random recurrence relations, we show that multiplicative noise, as it commonly arises due to variance in local rates of convergence, results in heavy-tailed stationary behavior in the parameters. Theoretical results are obtained characterizing this for a large class of (non-linear and even non-convex) models and optimizers (including momentum, Adam, and stochastic Newton), demonstrating that this phenomenon holds generally. We describe dependence on key factors, including step size, batch size, and data variability, all of which exhibit similar qualitative behavior to recent empirical results on state-of-the-art neural network models. Furthermore, we empirically illustrate how multiplicative noise and heavy-tailed structure improve the capacity for basin hopping and exploration of non-convex loss surfaces, compared with commonly considered stochastic dynamics with only additive noise and light-tailed structure.
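The core phenomenon can be illustrated with a toy simulation. The following is a minimal sketch (not the paper's actual experiments): a scalar random recurrence x ← a·x + b, where the random coefficient a plays the role of multiplicative noise. When E[log|a|] < 0 the iterates are stable, but occasional realizations with |a| > 1 produce a heavy-tailed (power-law) stationary distribution, in contrast to the light-tailed Gaussian stationary law of the purely additive recurrence with a fixed contraction factor. All parameter values below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_chains = 2000, 5000

# Multiplicative + additive noise: x <- a*x + b with random a.
# Here E[log|a|] < 0, so a stationary distribution exists, yet
# |a| > 1 occurs with positive probability, which is the mechanism
# behind heavy-tailed stationary behavior in such recurrences.
a = rng.normal(loc=0.7, scale=0.5, size=(n_steps, n_chains))
b = rng.normal(size=(n_steps, n_chains))
x_mult = np.zeros(n_chains)
for t in range(n_steps):
    x_mult = a[t] * x_mult + b[t]

# Additive noise only: x <- 0.7*x + b, a fixed contraction.
# Its stationary distribution is Gaussian (light-tailed).
x_add = np.zeros(n_chains)
for t in range(n_steps):
    x_add = 0.7 * x_add + b[t]

# Compare tail heaviness via the ratio of an extreme quantile
# to the median of |x| across chains at the final step.
ratio_mult = np.quantile(np.abs(x_mult), 0.999) / np.quantile(np.abs(x_mult), 0.5)
ratio_add = np.quantile(np.abs(x_add), 0.999) / np.quantile(np.abs(x_add), 0.5)
print(ratio_mult > ratio_add)  # multiplicative noise yields heavier tails
```

Note that the mean coefficient (0.7) is the same in both recurrences; only the randomness of the multiplicative factor differs, isolating its effect on the tails.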

Author Information

Liam Hodgkinson (University of California Berkeley)
Michael Mahoney (UC Berkeley)
