Optimization-induced Implicit Graph Diffusion
Qi Chen · Yifei Wang · Yisen Wang · Jiansheng Yang · Zhouchen Lin

Tue Jul 19 11:50 AM -- 11:55 AM (PDT)

Due to the over-smoothing issue, most existing graph neural networks (GNNs) can only capture limited dependencies with their inherently finite aggregation layers. To overcome this limitation, we propose a new kind of graph convolution, called Optimization-induced Implicit Graph Diffusion (OIGD), which implicitly has access to infinitely many hops of neighbors while adaptively aggregating features with nonlinear diffusion to prevent over-smoothing. Notably, we show that the learned representation can be formalized as the minimizer of an explicit convex optimization objective. With this property, we can theoretically characterize the equilibrium of OIGD from an optimization perspective. More interestingly, we can derive new structural variants by modifying the corresponding optimization objective. Specifically, we can embed prior properties into the equilibrium and introduce skip connections to promote training stability. Extensive experiments show that OIGD effectively captures long-range dependencies and, with nonlinear diffusion, performs well on both homophilic and heterophilic graphs. Moreover, the optimization-induced variants of our models boost performance and improve training stability and efficiency. As a result, OIGD obtains significant improvements on both node-level and graph-level tasks.
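The core idea of an implicit (equilibrium-based) graph diffusion layer can be illustrated with a minimal sketch. Note this is NOT the paper's exact OIGD formulation, which uses nonlinear diffusion; the linear variant below (with an assumed teleport parameter `alpha` and symmetric normalization, in the style of Personalized PageRank propagation) only illustrates how an equilibrium implicitly aggregates infinitely many hops of neighbors while remaining the minimizer of an explicit convex objective:

```python
import numpy as np

# Illustrative sketch (not the authors' implementation): node embeddings Z
# are defined as the fixed point of
#     Z = (1 - alpha) * A_hat @ Z + alpha * X,
# which minimizes the convex objective
#     alpha * ||Z - X||_F^2 + (1 - alpha) * tr(Z^T (I - A_hat) Z)
# and equals the infinite series alpha * sum_k ((1-alpha) * A_hat)^k @ X,
# i.e. an aggregation over infinitely many hops.

def normalized_adjacency(A):
    """Symmetrically normalized adjacency A_hat = D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def implicit_diffusion(A, X, alpha=0.1, tol=1e-10, max_iter=10_000):
    """Solve for the equilibrium by Picard (fixed-point) iteration.

    Converges because the map is a contraction: the spectral norm of
    A_hat is at most 1, so (1 - alpha) * A_hat has norm < 1.
    """
    A_hat = normalized_adjacency(A)
    Z = X.copy()
    for _ in range(max_iter):
        Z_new = (1 - alpha) * (A_hat @ Z) + alpha * X
        if np.linalg.norm(Z_new - Z) < tol:
            break
        Z = Z_new
    return Z

# Tiny example: a 3-node path graph with a one-hot scalar feature.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1.0], [0.0], [0.0]])
Z = implicit_diffusion(A, X)
```

Because the layer is defined by its equilibrium rather than by a fixed number of stacked propagation steps, its memory cost is independent of the effective receptive field, and gradients can be obtained through the implicit function theorem rather than by backpropagating through every iteration.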

Author Information

Qi Chen (Peking University)
Yifei Wang (Peking University)
Yisen Wang (Peking University)
Jiansheng Yang (Peking University)
Zhouchen Lin (Peking University)
