We revisit semi-supervised learning on hypergraphs. As in previous approaches, our method is based on a convex program whose objective function is not everywhere differentiable. We exploit the non-uniqueness of the optimal solutions and consider confidence intervals, which give the exact ranges of values that unlabeled vertices take over all optimal solutions. Moreover, we give a much simpler approach for solving the convex program, based on the subgradient method. Our experiments on real-world datasets confirm that our confidence-interval approach on hypergraphs outperforms existing methods, and that our subgradient method gives faster running times when the number of vertices is much larger than the number of edges.
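The abstract does not include the convex program itself. As an illustration only, the sketch below applies the subgradient method to a simple non-smooth hypergraph objective, f(x) = Σ_e w_e (max_{u∈e} x_u − min_{v∈e} x_v), with labeled vertices held fixed by projection after each step. The function name, the specific objective, and the diminishing step-size schedule are assumptions for this sketch, not the paper's actual formulation.

```python
import numpy as np

def hypergraph_subgradient(edges, weights, labels, n, iters=500, step=0.1):
    """Minimize f(x) = sum_e w_e * (max_{u in e} x_u - min_{v in e} x_v)
    with x_v held fixed for every labeled vertex v, via the (projected)
    subgradient method.  `labels` maps vertex index -> fixed value.
    Illustrative sketch only; not the paper's exact convex program."""
    x = np.zeros(n)
    for v, val in labels.items():
        x[v] = val
    best_x, best_f = x.copy(), np.inf
    for t in range(1, iters + 1):
        g = np.zeros(n)
        f = 0.0
        for e, w in zip(edges, weights):
            e = list(e)
            vals = x[e]
            u = e[int(np.argmax(vals))]  # a vertex attaining the max in e
            v = e[int(np.argmin(vals))]  # a vertex attaining the min in e
            f += w * (x[u] - x[v])
            g[u] += w                    # one valid subgradient of the
            g[v] -= w                    # non-differentiable max-min term
        if f < best_f:                   # track the best iterate, since
            best_f, best_x = f, x.copy() # subgradient steps may not descend
        x = x - (step / np.sqrt(t)) * g  # diminishing step size c/sqrt(t)
        for lv, val in labels.items():   # project back onto the constraint
            x[lv] = val                  # set (labeled values stay fixed)
    return best_x, best_f
```

For example, with edges `[[0, 1, 2], [1, 2, 3]]`, unit weights, and labels `{0: 1.0, 3: 2.0}`, the best objective value found approaches 1, and any assignment with x₁ = x₂ ∈ [1, 2] is optimal; that whole interval of optimal values for an unlabeled vertex is the kind of non-uniqueness the confidence-interval approach captures.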
Chenzi Zhang (University of Hong Kong)
Shuguang Hu (University of Hong Kong)
Zhihao Gavin Tang (University of Hong Kong)
Hubert Chan (University of Hong Kong)
Related Events (a corresponding poster, oral, or spotlight)
2017 Talk: Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method
Mon Aug 7th 05:30 -- 05:48 AM Room C4.6 & C4.7