Oral
Fri Jul 13 02:20 AM -- 02:40 AM (PDT) @ K1
Gradient descent with identity initialization efficiently learns positive definite linear transformations by deep residual networks
Peter Bartlett · Dave Helmbold · Phil Long
We analyze algorithms for approximating a function $f(x) = \Phi x$ mapping $\Re^d$ to $\Re^d$ using deep linear neural networks, i.e., networks that learn a function $h$ parameterized by matrices $\Theta_1, \ldots, \Theta_L$ and defined by $h(x) = \Theta_L \Theta_{L-1} \cdots \Theta_1 x$. We focus on algorithms that learn through gradient descent on the population quadratic loss in the case that the distribution over the inputs is isotropic. We provide polynomial bounds on the number of iterations for gradient descent to approximate the least squares matrix $\Phi$, in the case where the initial hypothesis $\Theta_1 = \cdots = \Theta_L = I$ has excess loss bounded by a small enough constant. On the other hand, we show that gradient descent fails to converge for $\Phi$ whose distance from the identity is a larger constant, and we show that some forms of regularization toward the identity in each layer do not help. If $\Phi$ is symmetric positive definite, we show that an algorithm that initializes $\Theta_i = I$ learns an $\epsilon$-approximation of $f$ using a number of updates polynomial in $L$, the condition number of $\Phi$, and $\log(d/\epsilon)$. In contrast, we show that if the least squares matrix $\Phi$ is symmetric and has a negative eigenvalue, then all members of a class of algorithms that perform gradient descent with identity initialization, and optionally regularize toward the identity in each layer, fail to converge. We analyze an algorithm for the case that $\Phi$ satisfies $u^{\top} \Phi u > 0$ for all $u$, but may not be symmetric. This algorithm uses two regularizers: one that maintains the invariant $u^{\top} \Theta_L \Theta_{L-1} \cdots \Theta_1 u > 0$ for all $u$, and another that ``balances'' $\Theta_1, \ldots, \Theta_L$ so that they have the same singular values.
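
To make the setting concrete, here is a minimal NumPy sketch (not taken from the paper) of gradient descent with identity initialization on a deep linear network, using the fact that with isotropic inputs the population quadratic loss reduces, up to a constant, to the Frobenius distance between $\Theta_L \cdots \Theta_1$ and $\Phi$. The function name deep_linear_gd, the step size, the step count, and the example target $\Phi$ are illustrative assumptions, not choices made in the paper.

    import numpy as np

    def deep_linear_gd(Phi, L=4, eta=0.01, steps=2000):
        """Gradient descent with identity initialization on the population
        quadratic loss.  With isotropic inputs, the population loss of
        h(x) = Theta_L ... Theta_1 x against f(x) = Phi x reduces (up to a
        constant) to (1/2) * ||Theta_L ... Theta_1 - Phi||_F^2."""
        d = Phi.shape[0]
        Thetas = [np.eye(d) for _ in range(L)]      # Theta_1 = ... = Theta_L = I

        for _ in range(steps):
            # End-to-end product A = Theta_L ... Theta_1 and its residual.
            A = np.eye(d)
            for Th in Thetas:
                A = Th @ A
            E = A - Phi

            # prefix[i] = Theta_i ... Theta_1 (layers below layer i+1), identity for i = 0;
            # suffix[i] = Theta_L ... Theta_{i+2} (layers above layer i+1), identity for i = L-1.
            prefix = [np.eye(d)]
            for Th in Thetas[:-1]:
                prefix.append(Th @ prefix[-1])
            suffix = [np.eye(d)]
            for Th in reversed(Thetas[1:]):
                suffix.append(suffix[-1] @ Th)
            suffix = suffix[::-1]

            # d/dTheta_{i+1} of (1/2)||S Theta_{i+1} P - Phi||_F^2 equals S^T E P^T.
            grads = [suffix[i].T @ E @ prefix[i].T for i in range(L)]
            Thetas = [Th - eta * g for Th, g in zip(Thetas, grads)]

        return Thetas

    # Illustration: a symmetric positive definite Phi close to the identity.
    rng = np.random.default_rng(0)
    d = 5
    B = 0.1 * rng.standard_normal((d, d))
    Phi = np.eye(d) + B @ B.T                       # SPD target near I
    Thetas = deep_linear_gd(Phi)
    A = np.linalg.multi_dot(Thetas[::-1])           # Theta_L ... Theta_1
    print(np.linalg.norm(A - Phi))                  # small after training

The example target is chosen close to the identity, matching the regime in which the paper's convergence guarantee applies; the sketch does not implement the positivity-maintaining or balancing regularizers discussed for non-symmetric $\Phi$.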