

Spotlight Poster

Multi-Track Message Passing: Tackling Oversmoothing and Oversquashing in Graph Learning via Preventing Heterophily Mixing

Hongbin Pei · Yu Li · Huiqi Deng · Jingxin Hai · Pinghui Wang · Jie Ma · Jing Tao · Yuheng Xiong · Xiaohong Guan

Hall C 4-9 #2301
Thu 25 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract: The advancement toward deeper graph neural networks is currently hindered by two inherent issues in message passing, *oversmoothing* and *oversquashing*. We identify the root cause of these issues as information loss due to *heterophily mixing* in aggregation, where messages of diverse category semantics are mixed. We propose a novel multi-track graph convolutional network to address oversmoothing and oversquashing effectively. Our basic idea is intuitive: if messages are separated and independently propagated according to their category semantics, heterophily mixing can be prevented. Consequently, we present a novel multi-track message passing scheme capable of preventing heterophily mixing, enhancing long-distance information flow, and improving the separation condition. Empirical evaluations show that our model achieves state-of-the-art performance on several graph datasets and effectively tackles oversmoothing and oversquashing, setting a new benchmark of $86.4\%$ accuracy on Cora.
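To make the core idea concrete, below is a minimal, hedged sketch of track-separated message passing: node messages are routed into per-category tracks via a soft assignment and aggregated within each track before recombination, so that messages of different category semantics are not mixed. This is an illustrative toy layer, not the authors' implementation; all names (`MultiTrackConv`, `num_tracks`, the soft-assignment head) are hypothetical.

```python
# Illustrative sketch of multi-track message passing (hypothetical, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTrackConv(nn.Module):
    def __init__(self, in_dim, out_dim, num_tracks):
        super().__init__()
        self.num_tracks = num_tracks
        self.assign = nn.Linear(in_dim, num_tracks)  # soft per-node track (category) assignment
        self.lin = nn.Linear(in_dim, out_dim)        # shared feature transform

    def forward(self, x, adj):
        # x: [N, in_dim] node features; adj: [N, N] row-normalized adjacency
        p = F.softmax(self.assign(x), dim=-1)        # [N, C] track probabilities
        h = self.lin(x)                              # [N, out_dim] transformed messages
        out = torch.zeros_like(h)
        for c in range(self.num_tracks):
            # aggregate each track's messages separately so categories do not mix
            track_msg = p[:, c:c + 1] * h            # messages routed to track c
            out = out + p[:, c:c + 1] * (adj @ track_msg)
        return out


if __name__ == "__main__":
    N, C = 5, 3
    x = torch.randn(N, 8)
    a = torch.eye(N) + torch.rand(N, N)              # toy adjacency with self-loops
    adj = a / a.sum(dim=1, keepdim=True)             # row-normalize
    layer = MultiTrackConv(8, 16, C)
    print(layer(x, adj).shape)                       # torch.Size([5, 16])
```

In this sketch, the per-track masking keeps heterophilic neighbors' messages in separate tracks during aggregation; the actual scheme proposed in the paper should be consulted for how tracks are defined and combined.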
