

G^2CN: Graph Gaussian Convolution Networks with Concentrated Graph Filters

Mingjie Li · Xiaojun Guo · Yifei Wang · Yisen Wang · Zhouchen Lin

Hall E #410

Keywords: [ DL: Other Representation Learning ] [ Deep Learning ] [ DL: Graph Neural Networks ]


Recently, linear GCNs have shown performance competitive with non-linear ones at a lower computational cost, and the key lies in their propagation layers. Spectral analysis has been widely adopted in designing and analyzing existing graph propagations. Nevertheless, we notice that existing spectral analysis fails to explain why graph propagations with the same global tendency, such as low-pass or high-pass, still yield very different results. Motivated by this, we develop a new framework for spectral analysis, called concentration analysis. In particular, we propose three attributes for our analysis: concentration centre, maximum response, and bandwidth. By dissecting the limitations of existing graph propagations through this analysis, we propose a new kind of propagation layer, Graph Gaussian Convolution Networks (G^2CN), in which the three properties are decoupled, making the whole structure more flexible and applicable to different kinds of graphs. Extensive experiments show that our proposed G^2CN obtains state-of-the-art performance on both heterophily and homophily datasets.
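To make the three attributes concrete, here is a minimal sketch of a Gaussian spectral filter on a graph, assuming the response takes the form exp(-b(λ - c)^2) over the eigenvalues λ of the normalized Laplacian, where `center` (c) sets the concentration centre and `bandwidth` (b) controls how narrow the pass-band is. This is an illustrative toy in the spectral domain, not the paper's actual (polynomial/propagation-based) implementation; all names here are hypothetical.

```python
import numpy as np

# Toy 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetrically normalized Laplacian; its eigenvalues lie in [0, 2]
deg = A.sum(axis=1)
d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L = np.eye(4) - d_inv_sqrt @ A @ d_inv_sqrt

evals, evecs = np.linalg.eigh(L)

def gaussian_response(lam, center, bandwidth):
    """Gaussian filter response: peaks (value 1) at `center`,
    decays faster for larger `bandwidth` (narrower pass-band)."""
    return np.exp(-bandwidth * (lam - center) ** 2)

# Filter node features in the spectral domain:
# H = U g(Lambda) U^T X, with g the Gaussian response above.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2))                     # node features
g = gaussian_response(evals, center=0.0, bandwidth=2.0)  # low-pass setting
H = evecs @ np.diag(g) @ evecs.T @ X                # filtered features
```

Setting `center=0.0` gives a low-pass filter (smoothing, suited to homophily), while moving `center` toward 2.0 emphasizes high-frequency components (suited to heterophily); `bandwidth` then tunes how sharply the response concentrates around that centre, illustrating how the three attributes can be varied independently.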
