Fedfit: Federated dynamic pruning via Fisher Information scoring
Abstract
Cross-device Federated Learning (FL) is frequently bottlenecked by the prohibitive computational and communication costs of training deep neural networks on resource-constrained edge hardware. While federated dynamic pruning aims to alleviate these costs by adjusting sparse topologies during training, existing methods rely on magnitude-based heuristics that are fundamentally ill-suited for the non-convergent, heterogeneous environments inherent to FL. To address this challenge, we propose Fedfit, a federated dynamic pruning framework that replaces simple heuristics with optimization-centric criteria for topology adjustment. By leveraging a second-order approximation of the loss landscape via the Fisher Information Matrix, Fedfit enables precise and efficient topology adjustment without the overhead of explicit Hessian computation. Empirical evaluations across computer vision and natural language processing benchmarks demonstrate that Fedfit significantly narrows the sparse-to-dense accuracy gap, outperforming state-of-the-art methods while maintaining high communication efficiency.
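To make the Fisher-based scoring idea concrete, the sketch below shows the standard diagonal-Fisher saliency sometimes used in pruning: the Hessian diagonal is approximated by the expected squared gradient, so each weight gets a score of the form s_i ≈ ½ F_ii w_i². This is a minimal illustrative sketch of that general technique, not Fedfit's actual criterion; the function names (`fisher_scores`, `prune_mask`) and the top-k masking step are our assumptions for illustration.

```python
import numpy as np

def fisher_scores(weights, grads):
    """Per-parameter saliency s_i = 0.5 * F_ii * w_i^2, where the
    Fisher diagonal F_ii is approximated by the mean squared
    per-sample gradient (grads has shape [batch, n_params])."""
    fisher_diag = np.mean(grads ** 2, axis=0)  # E[(d loss / d w_i)^2]
    return 0.5 * fisher_diag * weights ** 2

def prune_mask(weights, grads, sparsity):
    """Keep the (1 - sparsity) fraction of weights with the highest
    Fisher saliency; returns a boolean keep-mask over parameters."""
    scores = fisher_scores(weights, grads)
    k = int(round((1.0 - sparsity) * weights.size))
    keep = np.zeros(weights.size, dtype=bool)
    keep[np.argsort(scores)[-k:]] = True  # indices of the k largest scores
    return keep
```

Because the scores need only first-order gradients already available during local training, such a criterion avoids explicit Hessian computation, which is the property the abstract highlights.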