
On the Expressive Power of Deep Neural Networks
Maithra Raghu · Ben Poole · Surya Ganguli · Jon Kleinberg · Jascha Sohl-Dickstein

Mon Aug 07 01:30 AM -- 05:00 AM (PDT) @ Gallery #110

We propose a new approach to the problem of neural network expressivity, which seeks to characterize how structural properties of a neural network family affect the functions it is able to compute. Our approach is based on an interrelated set of measures of expressivity, unified by the novel notion of trajectory length, which measures how the output of a network changes as the input sweeps along a one-dimensional path. Our findings show that: (1) the complexity of the computed function grows exponentially with depth; (2) not all weights are equal: trained networks are more sensitive to their lower (initial) layer weights; (3) trajectory regularization is a simpler alternative to batch normalization, achieving the same performance.
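To make the trajectory-length measure concrete, here is a minimal sketch (not the authors' code) of the idea: sweep the input along a one-dimensional path, push each point through a randomly initialized tanh MLP, and sum the Euclidean distances between consecutive layer outputs. All names (`trajectory_length`, `mlp_outputs`) and the width/depth/weight-scale settings are illustrative assumptions.

```python
import numpy as np

def trajectory_length(points):
    # Arc-length proxy: sum of Euclidean distances between consecutive points.
    diffs = np.diff(points, axis=0)
    return np.linalg.norm(diffs, axis=1).sum()

def mlp_outputs(x, weights):
    # Forward pass of a plain tanh MLP; return the activations at every layer.
    h = x
    outs = []
    for W in weights:
        h = np.tanh(h @ W)
        outs.append(h)
    return outs

rng = np.random.default_rng(0)
width, depth, sigma_w = 100, 6, 4.0  # illustrative settings, not the paper's
weights = [rng.normal(0, sigma_w / np.sqrt(width), (width, width))
           for _ in range(depth)]

# One-dimensional input path: a circle embedded in a random 2-plane
# of the input space, sampled at 500 points.
t = np.linspace(0, 2 * np.pi, 500)
u, v = rng.normal(size=(2, width))
path = np.outer(np.cos(t), u) + np.outer(np.sin(t), v)

# Trajectory length of the image of the path at each layer; the paper's
# result predicts this grows with depth for sufficiently large weight scale.
lengths = [trajectory_length(layer) for layer in mlp_outputs(path, weights)]
```

Comparing `lengths` across layers gives an empirical view of how the computed function's complexity accumulates with depth.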

Author Information

Maithra Raghu (Google Brain / Cornell University)
Ben Poole (Stanford University)
Surya Ganguli (Stanford)
Jon Kleinberg (Cornell University)
Jascha Sohl-Dickstein (Google Brain)
