Talk
Approximate Steepest Coordinate Descent
Sebastian Stich · Anant Raj · Martin Jaggi
Parkside 2
Abstract:
We propose a new rule for coordinate selection in coordinate descent methods for huge-scale optimization. The efficiency of this novel scheme is provably better than that of uniformly-at-random selection, and can reach the efficiency of steepest coordinate descent (SCD), enabling an acceleration by a factor of up to $n$, the number of coordinates. In many practical applications, our scheme can be implemented at no extra cost, with computational efficiency per iteration very close to that of the faster uniform selection.
Numerical experiments with Lasso and Ridge regression show promising improvements, in line with our theoretical guarantees.
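To make the two baselines mentioned in the abstract concrete, the following minimal sketch contrasts uniform random coordinate selection with exact steepest (Gauss-Southwell) selection on a ridge regression problem. It does not implement the approximate selection rule proposed in this work; the problem instance, step sizes, and all names below are illustrative assumptions.

```python
# Sketch: coordinate descent on ridge regression with two selection rules,
# uniform random vs. exact steepest (Gauss-Southwell). Illustrative only;
# this is not the ASCD rule from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_coords, lam = 200, 50, 0.1
A = rng.standard_normal((n_samples, n_coords))
b = rng.standard_normal(n_samples)
L = (A ** 2).sum(axis=0) + lam              # coordinate-wise curvature (Lipschitz constants)

def ridge_objective(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.5 * lam * x @ x

def coordinate_descent(select, iters=2000):
    x = np.zeros(n_coords)
    grad = A.T @ (A @ x - b) + lam * x      # full gradient, kept up to date
    for _ in range(iters):
        i = select(grad)
        delta = -grad[i] / L[i]             # exact minimization along coordinate i
        x[i] += delta
        grad += delta * (A.T @ A[:, i])     # rank-one update of the data term
        grad[i] += delta * lam              # update of the regularizer term
    return x

def uniform_rule(grad):                     # pick a coordinate uniformly at random
    return int(rng.integers(len(grad)))

def steepest_rule(grad):                    # SCD: coordinate with largest |gradient|
    return int(np.argmax(np.abs(grad)))

for name, rule in [("uniform", uniform_rule), ("steepest (SCD)", steepest_rule)]:
    x = coordinate_descent(rule)
    print(f"{name:>14}: f(x) = {ridge_objective(x):.6f}")
```

Note that the steepest rule above scans the full gradient at every step, which is exactly the overhead the paper's approximate scheme aims to avoid while retaining most of SCD's per-iteration progress.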