

Poster in Workshop: 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)

Breaking the Curse of Depth in Graph Convolutional Networks via Refined Initialization Strategy

Senmiao Wang · Yupeng Chen · Yushun Zhang · Tian Ding · Ruoyu Sun


Abstract:

Graph convolutional networks (GCNs) suffer from the curse of depth, a phenomenon where performance degrades significantly as network depth increases. While over-smoothing has been considered the primary cause of this issue, we discover that gradient vanishing or exploding under commonly used initialization methods also contributes to the curse of depth. Motivated by this, we propose to evaluate the quality of a GCN initialization from three aspects: forward propagation, backward propagation, and output diversity. We theoretically prove that conventional initialization methods fail to simultaneously maintain reasonable forward propagation and output diversity. To tackle this problem, we develop a new GCN initialization method called Signal Propagation on Graph (SPoGInit). By carefully designing signal-propagation metrics and optimizing the initial weights to balance them, SPoGInit effectively alleviates performance degradation in deep GCNs. We further introduce a new architecture, termed ReZeroGCN, which simultaneously addresses all three aspects at initialization. This architecture achieves performance gains on node classification tasks as depth increases from 4 to 64, e.g., a 10% gain in training accuracy and a 3% gain in test accuracy on OGBN-Arxiv. To the best of our knowledge, this is the first result to fully resolve the curse of depth on OGBN-Arxiv over such a range of depths.
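To make the three evaluation aspects concrete, here is a minimal PyTorch sketch (not the authors' code) of how forward propagation, backward propagation, and output diversity could be measured for a GCN at initialization. The function names and exact formulas are illustrative assumptions; the paper's own metric definitions may differ.

```python
# Illustrative sketch only: these metric definitions are assumptions,
# not the paper's exact formulas. `model` is any module with the
# signature model(x, adj) -> node embeddings.
import torch

def forward_metric(model, x, adj):
    # Ratio of output to input feature norms; a value near 1 suggests
    # forward signals neither vanish nor explode with depth.
    with torch.no_grad():
        out = model(x, adj)
    return (out.norm() / x.norm()).item()

def backward_metric(model, x, adj):
    # Ratio of gradient norms at the first vs. last parameters (assumes
    # parameters are registered in layer order); a value near 1 suggests
    # gradients survive backpropagation through the full depth.
    model.zero_grad()
    out = model(x, adj)
    out.pow(2).sum().backward()
    grads = [p.grad.norm() for p in model.parameters() if p.grad is not None]
    return (grads[0] / grads[-1]).item()

def output_diversity(model, x, adj):
    # Mean pairwise distance between node embeddings; a value near 0
    # indicates over-smoothing (all node representations collapse).
    with torch.no_grad():
        out = model(x, adj)
    return torch.cdist(out, out).mean().item()
```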
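The abstract does not spell out the ReZeroGCN architecture, but the name suggests a ReZero-style residual connection (Bachlechner et al., 2020), in which each residual branch is scaled by a learnable scalar initialized to zero so that the network starts as the identity map. A hedged sketch under that assumption:

```python
# Assumption-based sketch of a ReZero-style GCN layer; this is a guess
# at the architecture, not the authors' implementation.
import torch
import torch.nn as nn

class ReZeroGCNLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.weight = nn.Linear(dim, dim, bias=False)
        self.alpha = nn.Parameter(torch.zeros(1))  # residual gate, init 0

    def forward(self, x, adj):
        # adj: normalized adjacency matrix (dense [N, N] for simplicity).
        h = torch.relu(self.weight(adj @ x))  # standard GCN propagation
        return x + self.alpha * h             # identity map at initialization
```

At initialization alpha = 0, so each layer is the identity: forward norms and output diversity are exactly preserved regardless of depth, and gradients reach early layers through the skip connection, which would address all three aspects the abstract names.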
