
Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization
Baojian Zhou · Feng Chen · Yiming Ying

Thu Jun 13 09:20 AM -- 09:25 AM (PDT) @ Room 104
Stochastic optimization algorithms update models sequentially with cheap per-iteration costs, which makes them amenable to large-scale data analysis. Such algorithms have been widely studied for structured sparse models where the sparsity information is very specific, e.g., convex sparsity-inducing norms or the $\ell^0$-norm. However, these norms cannot be directly applied to complex (non-convex) graph-structured sparsity models, which have important applications in disease outbreak detection, social networks, and related areas. In this paper, we propose a stochastic gradient-based method for solving graph-structured sparsity constraint problems that is not restricted to the least-squares loss. We prove that our algorithm enjoys linear convergence up to a constant error, which is competitive with its counterparts in the batch learning setting. We conduct extensive experiments to demonstrate the efficiency and effectiveness of the proposed algorithms. To the best of our knowledge, this is the first stochastic gradient-based method with theoretical convergence guarantees for graph-structured constrained optimization problems.
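The core iteration the abstract describes, stochastic iterative hard thresholding, alternates a cheap stochastic gradient step with a projection onto a sparsity model. A minimal sketch, assuming a least-squares loss and plain top-$k$ hard thresholding as a stand-in for the paper's graph-structured projection (the actual method uses approximate head/tail projections onto graph models; all function and parameter names below are hypothetical):

```python
import numpy as np

def hard_threshold(x, k):
    # Keep the k largest-magnitude entries and zero the rest.
    # Stand-in for the (approximate) projection onto a
    # graph-structured sparsity model used in the paper.
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

def stochastic_iht(A, y, k, lr=0.005, n_iters=2000, batch=8, seed=0):
    """Sketch of stochastic IHT for
        min_w 0.5 * ||A w - y||^2   s.t.  ||w||_0 <= k.
    (Illustrative only; not the paper's exact algorithm.)"""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        i = rng.choice(n, size=batch, replace=False)   # sample a mini-batch
        grad = A[i].T @ (A[i] @ w - y[i]) / batch      # stochastic gradient
        w = hard_threshold(w - lr * grad, k)           # step, then project
    return w

# Usage: recover a k-sparse signal from noisy linear measurements.
rng = np.random.default_rng(1)
d, n, k = 50, 200, 5
w_true = np.zeros(d)
w_true[:k] = rng.normal(size=k)
A = rng.normal(size=(n, d))
y = A @ w_true + 0.01 * rng.normal(size=n)
w_hat = stochastic_iht(A, y, k)
```

With a fixed step size the iterates converge only up to a constant error floor, which mirrors the "linear convergence up to a constant error" guarantee stated in the abstract.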

Author Information

Baojian Zhou (University at Albany, SUNY)
Feng Chen (University at Albany, SUNY)
Yiming Ying (SUNY Albany)
