Poster
Mon Aug 07 01:30 AM -- 05:00 AM (PDT) @ Gallery #93
Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method
Chenzi Zhang · Shuguang Hu · Zhihao Gavin Tang · Hubert Chan

We revisit semi-supervised learning on hypergraphs. As in previous approaches, our method is based on a convex program whose objective function is not everywhere differentiable. We exploit the non-uniqueness of the optimal solutions and consider confidence intervals, which give the exact ranges of values that unlabeled vertices can take over all optimal solutions. Moreover, we give a much simpler approach for solving the convex program, based on the subgradient method. Our experiments on real-world datasets confirm that our confidence interval approach on hypergraphs outperforms existing methods, and that our subgradient method gives faster running times when the number of vertices is much larger than the number of edges.
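To illustrate the kind of solver the abstract describes, here is a minimal sketch of a projected subgradient method for hypergraph semi-supervised learning. It assumes the standard non-differentiable objective from hypergraph Laplacian formulations, namely minimizing the sum over hyperedges of w_e (max_{u in e} f_u - min_{v in e} f_v)^2 subject to the labeled vertices keeping their labels; the data, step-size schedule, and variable names are hypothetical, and this is not the authors' implementation.

```python
import numpy as np

# Hypothetical toy instance: hyperedges as lists of vertex indices,
# unit edge weights, and +1/-1 labels on a small set of labeled vertices.
hyperedges = [[0, 1, 2], [1, 2, 3], [2, 3, 4]]
weights = [1.0, 1.0, 1.0]
labels = {0: 1.0, 4: -1.0}
n = 5  # total number of vertices

# Initialize f, fixing labeled vertices to their labels.
f = np.zeros(n)
for v, y in labels.items():
    f[v] = y

step0 = 1.0
for t in range(1, 501):
    g = np.zeros(n)
    for e, w in zip(hyperedges, weights):
        vals = f[e]
        u_max = e[int(np.argmax(vals))]
        u_min = e[int(np.argmin(vals))]
        diff = f[u_max] - f[u_min]
        # One valid subgradient of w * (max_e f - min_e f)^2: it is
        # nonzero only at the extreme vertices of the hyperedge.
        g[u_max] += 2.0 * w * diff
        g[u_min] -= 2.0 * w * diff
    f -= (step0 / np.sqrt(t)) * g      # diminishing step size
    for v, y in labels.items():        # project back onto label constraints
        f[v] = y

prediction = np.sign(f)  # predicted labels for all vertices
```

Because the subgradient at each hyperedge touches only its two extreme vertices, each iteration costs time proportional to the total size of the hyperedges, which is consistent with the abstract's claim that the method is fast when vertices greatly outnumber edges.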