Learning on Higher-Order Structures with Effective Operators
Abstract
Higher-order structures are powerful relational modeling tools, yet existing spectral operators decompose topology into separate ranks, leaving practitioners to fuse information back to vertices through ad-hoc choices. We introduce Collapsed Effective Operators, which marginalize higher-order structures into a single vertex-level operator via Schur complementation of a graded Laplacian. This yields a dense operator that encodes long-range interactions mediated by topology and applies to arbitrary higher-order constructs. We show that it preserves positive semi-definiteness and admits a strict spectral upper bound relative to the rank-0 Laplacian, effectively lowering system energy under higher-order connectivity. Empirically, our operator significantly improves spectral clustering, enables diffusion over topological structures, and accelerates neural-network processing of higher-order structures.
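To make the collapse concrete, the following is a minimal numerical sketch of Schur complementation onto the vertex block. It assumes a simple star-expansion lifting (each hyperedge becomes an auxiliary node joined to its member vertices), which stands in for the paper's graded Laplacian; the hyperedges and sizes here are illustrative, not from the paper.

```python
import numpy as np

# Hypothetical example: 4 vertices and two hyperedges {0,1,2} and {2,3}.
# Star expansion: each hyperedge becomes an auxiliary node connected to
# its member vertices; we then Schur-complement out the auxiliary block.
n_vertices, hyperedges = 4, [(0, 1, 2), (2, 3)]
n = n_vertices + len(hyperedges)

A = np.zeros((n, n))
for k, he in enumerate(hyperedges):
    for v in he:
        A[v, n_vertices + k] = A[n_vertices + k, v] = 1.0

L = np.diag(A.sum(axis=1)) - A  # Laplacian of the lifted graph

# Block partition: vertex block (V) and hyperedge/auxiliary block (H).
L_VV = L[:n_vertices, :n_vertices]
L_VH = L[:n_vertices, n_vertices:]
L_HH = L[n_vertices:, n_vertices:]

# Schur complement: a dense effective operator on vertices alone that
# routes interactions through the eliminated higher-order block.
L_eff = L_VV - L_VH @ np.linalg.inv(L_HH) @ L_VH.T

print(np.linalg.eigvalsh(L_eff))  # nonnegative: PSD is preserved
```

Note that `L_eff` is dense even though the lifted Laplacian is sparse: vertices 0 and 3 interact through the shared vertex 2, illustrating the long-range coupling mediated by topology. The constant vector also remains in the kernel, as Schur complementation of a Laplacian preserves it.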