MC-HNN: Learning Latent Structural Semantics and High-Rank Representations for Hypergraph Neural Networks
Abstract
Hypergraph Neural Networks (HNNs) have emerged as powerful tools for modeling complex high-order correlations. Most existing HNNs adhere to a two-stage message passing paradigm, in which node feature propagation is mediated by hyperedges. In this paper, we identify two fundamental theoretical limitations inherent to this paradigm, which we term rank collapse and hyperedge semantic dependency. To address these challenges, we propose the Multi-Channel Hypergraph Neural Network (MC-HNN). We design a multi-channel message passing mechanism to maintain high-rank representations, and introduce a latent hyperedge type encoding mechanism that injects an independent degree of freedom into hyperedge representations. Both theoretical analysis and empirical results demonstrate that MC-HNN effectively mitigates the limitations of the prevailing paradigm and achieves superior performance.
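For context, the two-stage message passing paradigm referred to above can be sketched as follows. This is a minimal illustrative example of the prevailing scheme (node features are first aggregated into hyperedges, then propagated back to nodes), not MC-HNN's proposed operators; the function name, the use of an incidence matrix, and the choice of mean aggregation are assumptions made for clarity.

```python
import numpy as np

def two_stage_propagation(H, X):
    """One round of hyperedge-mediated message passing (illustrative).

    H: (num_nodes, num_edges) incidence matrix, H[v, e] = 1 if node v is in hyperedge e
    X: (num_nodes, dim) node feature matrix
    """
    d_e = H.sum(axis=0)             # hyperedge degrees
    d_v = H.sum(axis=1)             # node degrees
    E = (H.T @ X) / d_e[:, None]    # stage 1: aggregate node features into hyperedges (mean)
    X_new = (H @ E) / d_v[:, None]  # stage 2: propagate hyperedge features back to nodes (mean)
    return X_new

# Toy hypergraph: 3 nodes, 2 hyperedges, e0 = {v0, v1}, e1 = {v1, v2}
H = np.array([[1, 0],
              [1, 1],
              [0, 1]], dtype=float)
X = np.eye(3)  # one-hot node features
out = two_stage_propagation(H, X)
print(out)
```

Because every node representation here is a mean of hyperedge representations, which are themselves means of node features, repeated rounds of this scheme tend to smooth features toward a low-rank subspace; this is the kind of behavior the rank collapse limitation points to.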