
GraphFM: Improving Large-Scale GNN Training via Feature Momentum
Haiyang Yu · Limei Wang · Bokun Wang · Meng Liu · Tianbao Yang · Shuiwang Ji

Wed Jul 20 03:30 PM -- 05:30 PM (PDT) @ Hall E #423

Training of graph neural networks (GNNs) for large-scale node classification is challenging. A key difficulty lies in obtaining accurate hidden node representations while avoiding the neighborhood explosion problem. Here, we propose a new technique, named feature momentum (FM), that uses a momentum step to incorporate historical embeddings when updating feature representations. We develop two specific algorithms, known as GraphFM-IB and GraphFM-OB, that consider in-batch and out-of-batch data, respectively. GraphFM-IB applies FM to in-batch sampled data, while GraphFM-OB applies FM to out-of-batch data that lie in the 1-hop neighborhood of in-batch data. We provide a convergence analysis for GraphFM-IB and some theoretical insight for GraphFM-OB. Empirically, we observe that GraphFM-IB can effectively alleviate the neighborhood explosion problem of existing methods. In addition, GraphFM-OB achieves promising performance on multiple large-scale graph datasets.
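To make the momentum step concrete, the sketch below keeps a table of historical node embeddings and blends them with freshly computed ones via a momentum coefficient. This is an illustrative PyTorch sketch under assumed details, not the paper's released implementation; the class name FeatureMomentumStore, the coefficient beta, and the toy dimensions are made up for this example.

import torch
import torch.nn as nn

class FeatureMomentumStore(nn.Module):
    """Keeps a momentum-averaged copy of hidden node embeddings (illustrative only)."""

    def __init__(self, num_nodes, hidden_dim, beta=0.9):
        super().__init__()
        self.beta = beta
        # Historical embeddings, one row per node; stored as a buffer, not trained by gradients.
        self.register_buffer("hist", torch.zeros(num_nodes, hidden_dim))

    @torch.no_grad()
    def update(self, node_ids, new_emb):
        # Momentum step: hist <- beta * hist + (1 - beta) * new_emb
        self.hist[node_ids] = self.beta * self.hist[node_ids] + (1.0 - self.beta) * new_emb

    def lookup(self, node_ids):
        # Retrieve stored embeddings, e.g. for out-of-batch 1-hop neighbors of a mini-batch.
        return self.hist[node_ids]


if __name__ == "__main__":
    store = FeatureMomentumStore(num_nodes=1000, hidden_dim=64, beta=0.9)
    batch = torch.randint(0, 1000, (32,))
    fresh = torch.randn(32, 64)   # embeddings produced by the current GNN layer for in-batch nodes
    store.update(batch, fresh)    # momentum update of the historical embeddings
    cached = store.lookup(batch)  # reuse later without recomputing the full neighborhood

The design intent mirrored here is that stale historical embeddings are smoothed toward fresh ones rather than overwritten, which is what allows the GNN to avoid expanding the full multi-hop neighborhood at every step.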

Author Information

Haiyang Yu (Texas A&M University)
Limei Wang (Texas A&M University)
Bokun Wang (The University of Iowa)
Meng Liu (Texas A&M University)
Tianbao Yang (The University of Iowa)
Shuiwang Ji (Texas A&M University)
