Poster
Spurious Local Minima are Common in Two-Layer ReLU Neural Networks
Itay Safran · Ohad Shamir
We consider the optimization problem associated with training simple ReLU neural networks of the form $\mathbf{x}\mapsto \sum_{i=1}^{k}\max\{0,\mathbf{w}_i^\top \mathbf{x}\}$ with respect to the squared loss. We provide a computer-assisted proof that even if the input distribution is standard Gaussian, even if the dimension is arbitrarily large, and even if the target values are generated by such a network with orthonormal parameter vectors, the problem can still have spurious local minima once $6\le k\le 20$. By a concentration-of-measure argument, this implies that in high input dimensions, \emph{nearly all} target networks of the relevant sizes lead to spurious local minima. Moreover, we conduct experiments showing that the probability of hitting such local minima is quite high, and that it increases with the network size. On the positive side, mild over-parameterization appears to drastically reduce such local minima, indicating that an over-parameterization assumption is necessary to obtain a positive result in this setting.
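The setting in the abstract can be reproduced numerically. The sketch below (illustrative only, not the paper's computer-assisted proof; the dimension, sample size, step size, and iteration count are arbitrary choices) trains a student network of the stated form against a target network with orthonormal parameter vectors on standard Gaussian inputs, using plain gradient descent on the empirical squared loss:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 20, 6      # input dimension; number of ReLU units (k = 6 is the smallest size in the stated range)
n = 5000          # Monte Carlo samples approximating the Gaussian-input population loss

# Target network with orthonormal parameter vectors (here: the first k standard basis vectors).
W_star = np.eye(d)[:k]                               # shape (k, d)

def net(W, X):
    """Two-layer ReLU network: x -> sum_i max(0, w_i^T x)."""
    return np.maximum(0.0, X @ W.T).sum(axis=1)

def sq_loss(W, X, y):
    """Empirical squared loss against target outputs y."""
    return np.mean((net(W, X) - y) ** 2)

def grad(W, X, y):
    """Gradient of the empirical squared loss w.r.t. each w_i."""
    pre = X @ W.T                                    # (n, k) pre-activations
    resid = np.maximum(0.0, pre).sum(axis=1) - y     # (n,) residuals
    act = (pre > 0).astype(float)                    # ReLU (sub)gradient indicator
    # d/dw_i = (2/n) * sum_j resid_j * 1[w_i^T x_j > 0] * x_j
    return 2.0 * (act * resid[:, None]).T @ X / len(X)

X = rng.standard_normal((n, d))                      # standard Gaussian inputs
y = net(W_star, X)                                   # targets generated by the target network
W = rng.standard_normal((k, d)) / np.sqrt(d)         # random initialization

loss_init = sq_loss(W, X, y)
for _ in range(1000):                                # plain gradient descent
    W -= 0.05 * grad(W, X, y)
loss_final = sq_loss(W, X, y)
print(f"loss: {loss_init:.4f} -> {loss_final:.4f}")
```

Depending on the random seed, such a run may drive the loss to (near) zero or stall at a strictly positive value; the latter outcome is the spurious-local-minimum phenomenon whose probability the abstract's experiments quantify.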
Author Information
Itay Safran (Weizmann Institute of Science)
Ohad Shamir (Weizmann Institute of Science)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Oral: Spurious Local Minima are Common in Two-Layer ReLU Neural Networks (Fri Jul 13th, 09:00--09:20 AM, Room K1)
More from the Same Authors
- 2020 Poster: The Complexity of Finding Stationary Points with Stochastic Gradient Descent (Yoel Drori · Ohad Shamir)
- 2020 Poster: Proving the Lottery Ticket Hypothesis: Pruning is All You Need (Eran Malach · Gilad Yehudai · Shai Shalev-Shwartz · Ohad Shamir)
- 2020 Poster: Is Local SGD Better than Minibatch SGD? (Blake Woodworth · Kumar Kshitij Patel · Sebastian Stich · Zhen Dai · Brian Bullins · Brendan McMahan · Ohad Shamir · Nati Srebro)
- 2017 Poster: Oracle Complexity of Second-Order Methods for Finite-Sum Problems (Yossi Arjevani · Ohad Shamir)
- 2017 Poster: Online Learning with Local Permutations and Delayed Feedback (Liran Szlak · Ohad Shamir)
- 2017 Poster: Communication-efficient Algorithms for Distributed Stochastic Principal Component Analysis (Dan Garber · Ohad Shamir · Nati Srebro)
- 2017 Poster: Depth-Width Tradeoffs in Approximating Natural Functions With Neural Networks (Itay Safran · Ohad Shamir)
- 2017 Poster: Failures of Gradient-Based Deep Learning (Shaked Shammah · Shai Shalev-Shwartz · Ohad Shamir)
- 2017 Talk: Depth-Width Tradeoffs in Approximating Natural Functions With Neural Networks (Itay Safran · Ohad Shamir)
- 2017 Talk: Failures of Gradient-Based Deep Learning (Shaked Shammah · Shai Shalev-Shwartz · Ohad Shamir)
- 2017 Talk: Oracle Complexity of Second-Order Methods for Finite-Sum Problems (Yossi Arjevani · Ohad Shamir)
- 2017 Talk: Online Learning with Local Permutations and Delayed Feedback (Liran Szlak · Ohad Shamir)
- 2017 Talk: Communication-efficient Algorithms for Distributed Stochastic Principal Component Analysis (Dan Garber · Ohad Shamir · Nati Srebro)