Oral
Characterizing Implicit Bias in Terms of Optimization Geometry
Suriya Gunasekar · Jason Lee · Daniel Soudry · Nati Srebro
Abstract:
We study the bias of generic optimization methods, including Mirror Descent, Natural Gradient Descent, and Steepest Descent with respect to different potentials and norms, when optimizing underdetermined linear models or separable linear classification problems. We ask whether the global minimum (among the many possible global minima) reached by optimization can be characterized in terms of the potential or norm, independently of hyperparameter choices such as stepsize and momentum.
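The simplest instance of the implicit bias the abstract refers to is a well-known fact: plain gradient descent on an underdetermined least-squares problem, initialized at the origin, converges to the minimum Euclidean-norm interpolating solution, because the iterates never leave the row space of the data matrix. The sketch below (not the paper's code; problem sizes and stepsize are illustrative assumptions) demonstrates this numerically:

```python
import numpy as np

# Underdetermined linear regression: fewer equations (n) than unknowns (d),
# so infinitely many solutions w satisfy Xw = y exactly.
rng = np.random.default_rng(0)
n, d = 5, 20
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Gradient descent on 0.5 * ||Xw - y||^2, initialized at zero. Each update
# is a combination of rows of X, so w stays in the row space of X.
w = np.zeros(d)
lr = 0.01
for _ in range(20000):
    w -= lr * X.T @ (X @ w - y)

# The minimum-L2-norm interpolator, computed via the pseudoinverse.
w_min_norm = np.linalg.pinv(X) @ y

print(np.allclose(X @ w, y, atol=1e-6))       # GD interpolates the data
print(np.allclose(w, w_min_norm, atol=1e-4))  # and finds the min-norm solution
```

The paper asks when analogous characterizations hold for other geometries, e.g. whether mirror descent with a given potential selects the minimizer of the corresponding Bregman divergence, and whether such characterizations survive changes of stepsize and momentum.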