We expose a structure in the derivative of the logits with respect to the parameters of deep classification networks, which explains the existence of outliers in the spectrum of the Hessian. Previous works decomposed the Hessian into two components, attributing the outliers to one of them: the so-called covariance of gradients. We show that this term is not a covariance but a second moment matrix, i.e., it is influenced by the means of the gradients. These means possess an additive two-way structure that is the source of the outliers in the spectrum. This structure can be used to approximate the principal subspace of the Hessian using certain "averaging" operations, avoiding the need for high-dimensional eigenanalysis. We corroborate this claim across different datasets, architectures and sample sizes.
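The distinction between a covariance and a second moment matrix can be illustrated with a small simulation (a sketch for intuition, not code from the paper; the Gaussian gradient model, the mean direction `mu`, and the dimensions are made-up assumptions). A nonzero mean of simulated "gradients" produces an outlier in the spectrum of the second moment matrix that disappears once the mean is subtracted, and the mean direction itself approximates the top eigenvector, which is the "averaging" idea in miniature.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 10_000                      # hypothetical parameter dim and sample count
mu = np.zeros(d)
mu[0] = 5.0                            # assumed nonzero mean direction of the gradients
g = rng.normal(size=(n, d)) + mu       # simulated per-sample "gradients"

M = g.T @ g / n                        # second moment matrix E[g g^T]
C = np.cov(g, rowvar=False)            # covariance: mean is subtracted

eig_M = np.linalg.eigvalsh(M)          # ascending eigenvalues
eig_C = np.linalg.eigvalsh(C)

# M has one eigenvalue separated from the bulk, driven by mu; C does not.
top_vec = np.linalg.eigh(M)[1][:, -1]  # top eigenvector of M
cos = abs(top_vec @ mu) / np.linalg.norm(mu)  # alignment with the mean direction
```

Here the top eigenvalue of `M` is roughly `|mu|^2 + 1` while the rest of its spectrum, like all of `C`'s, stays near 1, and `cos` is close to 1: the mean (an "average") recovers the principal direction without eigenanalysis.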
Vardan Papyan (Stanford University)
Related Events (a corresponding poster, oral, or spotlight)
2019 Oral: Measurements of Three-Level Hierarchical Structure in the Outliers in the Spectrum of Deepnet Hessians
Thu Jun 13th, 12:10 -- 12:15 PM, Grand Ballroom