High-Fidelity ANN-to-SNN Conversion via Closed-Loop CKA Distillation
Bozhou Li ⋅ Chubo Liu ⋅ Yan Ding ⋅ Yufeng Zhang ⋅ Zhuo Tang ⋅ Kenli Li
Abstract
ANN-to-SNN conversion offers energy-efficient inference but faces a fidelity-latency trade-off due to open-loop error accumulation. While conversion-aware training mitigates this, it sacrifices the generality of using off-the-shelf ANNs. We propose a closed-loop fine-tuning framework that calibrates these accumulated errors without altering the source model. Our approach employs a Dual Alignment Mechanism that combines global Kullback-Leibler (KL) divergence for output distillation with an adaptive local Centered Kernel Alignment (CKA) constraint, weighted by the initial conversion loss, for feature alignment. We uncover a critical time-dependent dynamic: local constraints are essential for stabilizing representations in low-latency regimes (e.g., $T=8$), where global gradients are unstable, whereas global alignment drives fidelity at higher time steps. Experiments on CIFAR-10 demonstrate that our method achieves over 99\% of source ANN accuracy at $T=32$ (e.g., ResNet-18: 96.38\% vs.\ 96.39\%). Furthermore, this fine-tuning acts as a regularizer, yielding SNNs whose input-noise robustness matches or exceeds that of the source ANN.
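For concreteness, the sketch below shows one way the dual alignment objective summarized above could be assembled: a temperature-softened KL term on the output logits plus per-layer $(1 - \mathrm{CKA})$ penalties on intermediate features. This is a minimal PyTorch sketch under stated assumptions, not the paper's implementation; the helper names (`linear_cka`, `dual_alignment_loss`), the per-layer weights `layer_w`, and the hyperparameters `tau` and `beta` are hypothetical.

```python
import torch
import torch.nn.functional as F

def linear_cka(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Linear CKA between two (batch, features) activation matrices.

    For column-centered X, Y: CKA = ||X^T Y||_F^2 / (||X^T X||_F ||Y^T Y||_F).
    """
    x = x - x.mean(dim=0, keepdim=True)  # center each feature dimension
    y = y - y.mean(dim=0, keepdim=True)
    num = (x.t() @ y).norm(p='fro') ** 2   # cross-covariance energy
    den = (x.t() @ x).norm(p='fro') * (y.t() @ y).norm(p='fro')
    return num / (den + 1e-8)

def dual_alignment_loss(snn_logits, ann_logits, snn_feats, ann_feats,
                        layer_w, tau=4.0, beta=1.0):
    """Sketch of a dual alignment objective (assumed form, not the paper's code):
    global KL distillation on softened logits plus weighted local (1 - CKA) terms.

    snn_feats are assumed to be time-averaged firing rates per layer;
    layer_w are hypothetical per-layer weights, e.g. derived from each
    layer's initial conversion loss.
    """
    kl = F.kl_div(F.log_softmax(snn_logits / tau, dim=1),
                  F.softmax(ann_logits / tau, dim=1),
                  reduction='batchmean') * tau ** 2
    cka_term = sum(w * (1.0 - linear_cka(s.flatten(1), a.flatten(1)))
                   for w, s, a in zip(layer_w, snn_feats, ann_feats))
    return kl + beta * cka_term
```

Under this reading, weighting each layer's CKA penalty by its initial conversion loss (the hypothetical `layer_w`) would concentrate the local constraint on the layers that open-loop conversion distorted most, which is consistent with the abstract's claim that local alignment matters most in the unstable low-latency ($T=8$) regime.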