Lexicographic and Depth-Sensitive Margins in Homogeneous and Non-Homogeneous Deep Models
Mor Shpigel Nacson · Suriya Gunasekar · Jason Lee · Nati Srebro · Daniel Soudry

Tue Jun 11th 02:20 -- 02:25 PM @ Grand Ballroom

With an eye toward understanding complexity control in deep learning, we study how infinitesimal regularization or gradient descent optimization leads to margin-maximizing solutions in both homogeneous and non-homogeneous models, extending previous work that focused on infinitesimal regularization only in homogeneous models. To this end, we study the limit of loss minimization with a diverging norm constraint (the "constrained path"), relate it to the limit of a "margin path", and characterize the resulting solution. For non-homogeneous models, we show that this solution is biased toward the deepest part of the model, discarding the shallowest parts if they are unnecessary. For homogeneous models, we show convergence to a "lexicographic max-margin solution", and provide conditions under which max-margin solutions are also attained as the limit of unconstrained gradient descent.
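The two limiting objects named in the abstract can be written down concretely. The following is a sketch in standard notation from this line of work; the symbols (loss \mathcal{L}, norm bound B, model output f, labels y_n) are assumptions for illustration and are not taken from the page itself:

```latex
% Constrained path: loss minimization under a diverging norm constraint
w(B) \in \arg\min_{\|w\| \le B} \mathcal{L}(w),
\qquad
\mathcal{L}(w) = \sum_{n=1}^{N} \ell\bigl( y_n f(w; x_n) \bigr)

% Margin path: direct margin maximization under the same constraint
\tilde{w}(B) \in \arg\max_{\|w\| \le B} \; \min_{n} \, y_n f(w; x_n)

% The abstract relates the directions w(B)/\|w(B)\| and
% \tilde{w}(B)/\|\tilde{w}(B)\| as B \to \infty, and characterizes
% the resulting limiting (margin-maximizing) solution.
```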

Author Information

Mor Shpigel Nacson (Technion)
Suriya Gunasekar (Toyota Technological Institute at Chicago)
Jason Lee (University of Southern California)
Nati Srebro (Toyota Technological Institute at Chicago)
Daniel Soudry (Technion)
