

Poster

Proving the Lottery Ticket Hypothesis: Pruning is All You Need

Eran Malach · Gilad Yehudai · Shai Shalev-Schwartz · Ohad Shamir

Keywords: [ Deep Learning Theory ] [ Optimization ] [ Supervised Learning ] [ Deep Learning - Theory ]


Abstract:

The lottery ticket hypothesis (Frankle and Carbin, 2018) states that a randomly-initialized network contains a small subnetwork that, when trained in isolation, can compete with the performance of the original network. We prove an even stronger hypothesis (as was also conjectured in Ramanujan et al., 2019), showing that for every bounded distribution and every target network with bounded weights, a sufficiently over-parameterized neural network with random weights contains a subnetwork with roughly the same accuracy as the target network, without any further training.
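The statement can be illustrated with a small experiment. Below is a minimal NumPy sketch, not the paper's construction or proof: it freezes a wide random two-layer ReLU network and greedily keeps only those hidden units whose fixed contributions reduce the squared error to a small target network, so the approximation is obtained by pruning alone, with no weight ever trained. The sizes, the neuron-level greedy search, and the 1/sqrt(width) output scaling are illustrative assumptions; the paper's result concerns weight-level pruning with provable guarantees.

```python
import numpy as np

# Toy illustration: find a subnetwork of a frozen, randomly-initialized
# two-layer ReLU network that approximates a small target network,
# using only pruning -- no weight is ever trained.
rng = np.random.default_rng(0)
d, m_target, m_random, n = 2, 3, 2000, 500   # illustrative sizes

# Target network with bounded weights.
W_t = rng.uniform(-1.0, 1.0, size=(m_target, d))
v_t = rng.uniform(-1.0, 1.0, size=m_target)

# Over-parameterized random network: weights drawn once and frozen.
W_r = rng.normal(size=(m_random, d))
v_r = rng.normal(size=m_random) / np.sqrt(m_random)   # 1/sqrt(width) scaling

# Inputs from a bounded distribution; y = target network's outputs.
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.maximum(X @ W_t.T, 0.0) @ v_t

# Column i holds the frozen contribution of random hidden unit i on all inputs.
H = np.maximum(X @ W_r.T, 0.0) * v_r
half_norms = 0.5 * np.sum(H ** 2, axis=0)

# Greedy mask search: keep a unit only if adding its frozen contribution
# strictly decreases the squared error to the target outputs.
mask = np.zeros(m_random, dtype=bool)
pred = np.zeros(n)
for _ in range(m_random):
    gains = (y - pred) @ H - half_norms   # decrease in 0.5 * SSE per candidate unit
    gains[mask] = -np.inf
    i = int(np.argmax(gains))
    if gains[i] <= 0.0:
        break
    mask[i] = True
    pred += H[:, i]

print(f"kept {mask.sum()} of {m_random} random units")
print(f"relative MSE of pruned subnetwork: {np.mean((y - pred) ** 2) / np.mean(y ** 2):.4f}")
```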
