Stochastic Training of Graph Convolutional Networks with Variance Reduction
Jianfei Chen · Jun Zhu · Le Song

Wed Jul 11 07:00 AM -- 07:20 AM (PDT) @ A5

Graph convolutional networks (GCNs) are powerful deep neural networks for graph-structured data. However, a GCN computes the representation of each node recursively from its neighbors, so the receptive field size grows exponentially with the number of layers. Previous attempts to reduce the receptive field size by subsampling neighbors lack a convergence guarantee, and their receptive field size per node is still on the order of hundreds. In this paper, we develop control-variate-based algorithms with a new theoretical guarantee: they converge to a local optimum of GCN regardless of the neighbor sampling size. Empirical results show that our algorithms match the convergence rate and model quality of the exact algorithm while using only two neighbors per node. On a large Reddit dataset, our algorithms run in one seventh the time of previous neighbor-sampling algorithms.
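The variance-reduction idea behind the abstract can be sketched numerically: rather than rescaling a tiny neighbor sample directly (plain neighbor sampling), a control-variate estimator samples only the residual between current activations and stale "historical" activations, then adds back the exact sum of the historical terms. A minimal NumPy sketch of that comparison, where all sizes and the 0.1 staleness scale are illustrative assumptions rather than values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n neighbors, each with a d-dimensional activation.
n, d, sample_size, trials = 100, 8, 2, 200
h = rng.normal(size=(n, d))                  # current (exact) activations
h_hist = h + 0.1 * rng.normal(size=(n, d))   # stale historical activations
exact = h.sum(axis=0)                        # exact neighbor aggregation

mse_ns = mse_cv = 0.0
for _ in range(trials):
    idx = rng.choice(n, size=sample_size, replace=False)
    # Plain neighbor sampling: rescale a tiny sample (unbiased, high variance).
    ns_est = (n / sample_size) * h[idx].sum(axis=0)
    # Control variate: sample only the residual h - h_hist, and add the
    # full historical sum, which is known without any sampling.
    cv_est = ((n / sample_size) * (h[idx] - h_hist[idx]).sum(axis=0)
              + h_hist.sum(axis=0))
    mse_ns += ((ns_est - exact) ** 2).mean() / trials
    mse_cv += ((cv_est - exact) ** 2).mean() / trials
```

Both estimators are unbiased, but the control-variate estimator's error scales with the residual `h - h_hist`, which shrinks as training stabilizes; this is why a neighbor sample as small as two per node can suffice.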

Author Information

Jianfei Chen (Tsinghua University)
Jun Zhu (Tsinghua University)
Le Song (Georgia Institute of Technology)
