Federated Graph Learning via Structure-Aware Fusion Using a Kalman Framework with Learnable Dynamics
Abstract
Federated Graph Learning (FGL) enables collaborative training across distributed clients without sharing raw graph data. However, its performance is severely hindered by graph-specific heterogeneity arising from divergent node feature distributions and disparate graph structures. Existing FGL methods primarily focus on aligning or personalizing node features but largely overlook the role of structural knowledge, leading to aggregation-induced representation drift during message passing. We observe that structural heterogeneity often originates from feature-driven connection biases shaped by local data collection practices or user preferences. To address this, we propose \textbf{Fed-Kalter}, a novel FGL framework that integrates Kalman filtering principles into graph neural networks. Fed-Kalter introduces Kalter-Conv, a graph convolution grounded in a Kalman framework with learnable dynamics, which treats structural embeddings as latent states and feature-augmented neighborhoods as noisy observations, thereby filtering feature-induced structural noise in a layer-wise manner. Only structural parameters are aggregated globally, enabling effective cross-client knowledge transfer while preserving local personalization. Extensive experiments on 16 graph classification datasets spanning 4 domains demonstrate that Fed-Kalter consistently outperforms state-of-the-art FGL methods. Further ablation and hyperparameter studies confirm its robustness, efficiency, and effectiveness in mitigating structural heterogeneity.
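To make the filtering idea in the abstract concrete, the following is a minimal sketch of one Kalman-style graph convolution step, not the paper's actual Kalter-Conv implementation. It assumes a dense adjacency matrix, a learnable dynamics matrix `A_dyn`, and a scalar `gain` standing in for the Kalman gain; all names and simplifications here are illustrative.

```python
import numpy as np

def kalman_graph_conv(H, X, adj, A_dyn, gain):
    """One illustrative Kalman-style graph convolution step.

    H:     (n, d) latent structural embeddings (the hidden "state")
    X:     (n, d) node features
    adj:   (n, n) binary adjacency matrix
    A_dyn: (d, d) dynamics matrix (a stand-in for learnable dynamics)
    gain:  scalar in [0, 1] standing in for the Kalman gain
    """
    # Predict: propagate each node's latent state through the dynamics.
    H_pred = H @ A_dyn

    # Observe: treat the feature-augmented neighborhood mean as a
    # noisy measurement of the structural state.
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    Z = (adj @ X) / deg

    # Update: correct the prediction toward the observation; the gain
    # controls how much feature-induced noise is admitted per layer.
    return H_pred + gain * (Z - H_pred)
```

With `gain = 0` the layer trusts the structural dynamics entirely; with `gain = 1` it reduces to plain mean-neighborhood aggregation. In the actual framework the gain and dynamics would be learned, and only the structural parameters would be aggregated across clients.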