Poster

Implicit Regularization with Polynomial Growth in Deep Tensor Factorization

Kais Hariz · Hachem Kadri · Stéphane Ayache · Maher Moakher · Thierry Artières

Hall E #312

Keywords: [ DL: Theory ]

Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT
 
Spotlight presentation: DL: Theory
Wed 20 Jul 7:30 a.m. PDT — 9 a.m. PDT

Abstract:

We study the implicit regularization effects of deep learning in tensor factorization. While implicit regularization in deep matrix and 'shallow' tensor factorization via linear and certain types of non-linear neural networks promotes low-rank solutions with at most quadratic growth, we show that its effect in deep tensor factorization grows polynomially with the depth of the network. This provides a remarkably faithful description of the observed experimental behaviour. Using numerical experiments, we demonstrate the benefits of this implicit regularization in yielding more accurate estimates and better convergence properties.
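The setting the abstract describes can be illustrated with a minimal sketch, not the paper's construction: a depth-2 CP-style tensor factorization trained by plain gradient descent on a partially observed rank-1 tensor. All sizes, ranks, initialization scales, and hyperparameters below are illustrative assumptions; the implicit low-rank bias the abstract refers to is typically observed from small, balanced initializations like this one.

```python
import numpy as np

# Hypothetical setup (not the paper's exact experiment): recover a rank-1
# order-3 tensor T = a ⊗ a ⊗ a from ~70% of its entries, using an
# over-parameterized CP model whose factor matrices are themselves products
# of two matrices ("deep" factors).
rng = np.random.default_rng(0)

a = np.array([1.0, 0.5, 0.25, 0.125])
T = np.einsum('i,j,k->ijk', a, a, a)
mask = rng.random(T.shape) < 0.7            # observed-entry indicator

R = 3                                       # over-parameterized CP rank
A2, B2, C2 = (rng.normal(0, 0.2, (4, R)) for _ in range(3))
A1, B1, C1 = (rng.normal(0, 0.2, (R, R)) for _ in range(3))

lr, steps = 0.02, 30000
for step in range(steps):
    # Effective (shallow) CP factors are depth-2 products.
    A, B, C = A2 @ A1, B2 @ B1, C2 @ C1
    T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
    E = mask * (T_hat - T)                  # residual on observed entries only
    if step == 0:
        loss0 = 0.5 * np.sum(E ** 2)
    # Gradients of 0.5*||E||^2 w.r.t. the effective factors ...
    dA = np.einsum('ijk,jr,kr->ir', E, B, C)
    dB = np.einsum('ijk,ir,kr->jr', E, A, C)
    dC = np.einsum('ijk,ir,jr->kr', E, A, B)
    # ... chained through the depth-2 products A = A2 @ A1, etc.
    gA2, gA1 = dA @ A1.T, A2.T @ dA
    gB2, gB1 = dB @ B1.T, B2.T @ dB
    gC2, gC1 = dC @ C1.T, C2.T @ dC
    A2 -= lr * gA2; A1 -= lr * gA1
    B2 -= lr * gB2; B1 -= lr * gB1
    C2 -= lr * gC2; C1 -= lr * gC1

A, B, C = A2 @ A1, B2 @ B1, C2 @ C1
T_rec = np.einsum('ir,jr,kr->ijk', A, B, C)
loss_final = 0.5 * np.sum((mask * (T_rec - T)) ** 2)
# Singular values of a mode-1 unfolding indicate how low-rank the recovery is.
s = np.linalg.svd(T_rec.reshape(4, 16), compute_uv=False)
print(f"observed loss {loss0:.3f} -> {loss_final:.2e}, top singular values {s[:2]}")
```

Adding more matrix factors per CP factor increases the depth of the parameterization; the paper's claim is that the strength of the resulting implicit regularization grows polynomially with that depth, rather than at most quadratically as in the matrix and shallow-tensor cases.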
