Temporal Weighted Encoding: Towards Maximal-Capacity Spike Coding for ANN–SNN Conversion
Abstract
Spiking Neural Networks (SNNs) emulate the spiking behavior of biological neurons and are promising for energy-efficient neuromorphic computing. A widely used strategy for training SNNs is to convert pretrained Artificial Neural Networks (ANNs), where both accuracy and efficiency are determined by the spike encoding scheme. Traditional methods based on spike count or timing severely underutilize the available encoding space, causing severe accuracy degradation under low-timestep constraints. More expressive alternatives rely on complex dynamics, which hinder scalability and practical deployment. To address these challenges, we propose Temporal Weighted Encoding (TWE). Through a simple recursive integration, spikes are implicitly assigned exponentially decaying weights, drawing an analogy to a temporal bit sequence. We systematically analyze the temporal mismatch caused by this weight pattern and propose temporal relaxation and threshold relaxation to resolve it, enabling fast and accurate activation encoding. Extensive experiments demonstrate that TWE achieves negligible conversion loss with significantly fewer timesteps, offering a scalable and efficient solution for SNN deployment.
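The bit-sequence analogy behind TWE can be illustrated with a minimal sketch. This is an illustrative assumption, not the paper's exact formulation: it assumes a decay factor of 2 and hypothetical `twe_decode`/`rate_decode` helpers. Recursively integrating a spike train as V ← 2V + s_t implicitly weights earlier spikes exponentially, so a train of T binary spikes can represent 2^T distinct values, whereas spike-count (rate) coding distinguishes only T + 1.

```python
def twe_decode(spikes):
    """Recursively integrate a binary spike train (illustrative sketch).

    Each step doubles the accumulated potential before adding the new
    spike, so the t-th spike implicitly carries weight 2**(T-1-t),
    exactly like a bit in a binary number read most-significant first.
    """
    v = 0
    for s in spikes:
        v = 2 * v + s  # recursive integration: past spikes decay by 1/2 relative to v
    return v

def rate_decode(spikes):
    """Rate coding baseline: only the total spike count matters."""
    return sum(spikes)

T = 4
# Enumerate all 2**T possible spike trains of length T (MSB-first bits of n).
all_trains = [[(n >> (T - 1 - t)) & 1 for t in range(T)] for n in range(2 ** T)]

twe_values = {twe_decode(s) for s in all_trains}    # 16 distinct values
rate_values = {rate_decode(s) for s in all_trains}  # only 5 distinct values
print(len(twe_values), len(rate_values))
```

Under these assumptions, the same T = 4 timesteps yield 16 distinguishable activation levels with weighted encoding versus 5 with rate coding, which is the capacity gap the abstract refers to.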