
Almost surely constrained convex optimization
Olivier Fercoq · Ahmet Alacaoglu · Ion Necoara · Volkan Cevher

Wed Jun 12 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #101
We propose a stochastic gradient framework for solving stochastic composite convex optimization problems with a (possibly) infinite number of linear inclusion constraints that need to be satisfied almost surely. We use smoothing and homotopy techniques to handle the constraints without the need for matrix-valued projections. We show that our stochastic gradient algorithm achieves an $\mathcal{O}(\log(k)/\sqrt{k})$ convergence rate for general convex objectives and an $\mathcal{O}(\log(k)/k)$ convergence rate for restricted strongly convex objectives. These rates are known to be optimal up to a logarithmic factor, even without constraints. We conduct numerical experiments on basis pursuit, hard-margin support vector machine, and portfolio optimization problems and show that our algorithm achieves state-of-the-art practical performance.
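To make the idea concrete, here is a hedged illustrative sketch (not the authors' exact algorithm) of the general pattern the abstract describes: at each iteration, take a stochastic gradient step on the objective and a gradient step on a smoothed quadratic penalty for one sampled linear constraint, while a homotopy schedule shrinks the smoothing parameter $\beta_k$. The problem data, step-size schedule, and penalty normalization below are all assumptions chosen for the toy example.

```python
import numpy as np

# Illustrative sketch only, NOT the paper's method: stochastic gradient on a
# strongly convex objective f(x) = 0.5 * ||x||^2, plus a smoothed-penalty step
# on one randomly sampled linear constraint a_i^T x = b_i per iteration.
# The smoothing parameter beta_k shrinks over time (homotopy).

rng = np.random.default_rng(0)
n, m = 5, 100                        # variables, number of sampled constraints
A = rng.standard_normal((m, n))
x_star = rng.standard_normal(n)
b = A @ x_star                       # consistent system: constraints are satisfiable

def grad_f(x):
    # gradient of the placeholder objective f(x) = 0.5 * ||x||^2
    return x

x = np.zeros(n)
for k in range(1, 20001):
    step = 1.0 / np.sqrt(k)          # diminishing step size (assumed schedule)
    beta = 1.0 / np.sqrt(k)          # homotopy: smoothing parameter shrinks
    i = rng.integers(m)              # sample one constraint at random
    a = A[i]
    x = x - step * grad_f(x)         # stochastic gradient step on f
    # gradient step on the smoothed penalty (a^T x - b_i)^2 / (2 * beta * ||a||^2);
    # with step / beta = 1 this reduces to a full Kaczmarz-style projection
    x = x - (step / beta) * (a @ x - b[i]) / (a @ a) * a

violation = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print(f"relative constraint violation: {violation:.4f}")
```

The diminishing step size fights the objective's pull away from the feasible set, so the relative constraint violation decays over iterations, mirroring the almost-sure feasibility guarantee the abstract claims at a high level.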

Author Information

Olivier Fercoq (Télécom ParisTech, Université Paris-Saclay)
Ahmet Alacaoglu (EPFL)
Ion Necoara (University Bucharest)
Volkan Cevher (EPFL)
