Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD
Marten van Dijk · Lam Nguyen · Phuong Ha Nguyen · Dzung Phan

Tue Jun 11 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #193

We study Stochastic Gradient Descent (SGD) with diminishing step sizes for convex objective functions. We introduce a definitional framework and theory that defines and characterizes a core property, called curvature, of convex objective functions. In terms of curvature, we derive a new inequality that can be used to compute an optimal sequence of diminishing step sizes by solving a differential equation. Our exact solutions confirm known results in the literature and allow us to fully characterize a new regularizer with its corresponding expected convergence rates.
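To make the setting concrete, the following is a minimal illustrative sketch (not the paper's derived schedule) of SGD with a diminishing step size eta_t = eta_0 / (1 + beta * t) on a simple convex objective. The objective, constants eta_0 and beta, and the toy data are assumptions chosen for this demo.

```python
import random

# Illustrative sketch, not the paper's algorithm: SGD with a diminishing
# step size eta_t = eta_0 / (1 + beta * t) on the convex objective
# f(w) = average of (w - x_i)^2, whose minimizer is the sample mean.
# eta0 and beta are hypothetical tuning constants for this demo.

def sgd_diminishing(samples, eta0=0.5, beta=1.0, epochs=50, seed=0):
    rng = random.Random(seed)
    w = 0.0
    t = 0
    for _ in range(epochs):
        # Visit the samples in a fresh random order each epoch.
        for x in rng.sample(samples, len(samples)):
            eta = eta0 / (1.0 + beta * t)  # diminishing step size
            grad = 2.0 * (w - x)           # stochastic gradient of (w - x)^2
            w -= eta * grad
            t += 1
    return w

if __name__ == "__main__":
    data = [1.0, 2.0, 3.0, 4.0]
    print(sgd_diminishing(data))  # approaches the sample mean, 2.5
```

With these particular constants the update reduces to w_{t+1} = (t/(t+1)) w_t + x_t/(t+1), i.e. a running average of the visited samples, so the iterate converges to the sample mean; other choices of eta_0 and beta trade off the same bias/variance considerations the paper's step-size analysis makes precise.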

Author Information

Marten van Dijk (University of Connecticut)
Lam Nguyen (IBM Research, Thomas J. Watson Research Center)
Phuong Ha Nguyen (University of Connecticut)
Dzung Phan (IBM T.J. Watson Research Center)
