SoftBinary Coding: A New Information-Theoretic Paradigm for Neural Compression via Fast Channel Simulation
Abstract
Neural compression is currently dominated by Nonlinear Transform Coding (NTC), which maps data to real-valued latents via continuous transforms. Despite its success, NTC suffers from a train-test mismatch caused by non-differentiable quantization, a ``smoothness bias'' inherent in continuous transforms that precludes optimality for certain sources, and a loss of ``shaping gain'' because high-dimensional vector quantization is too complex to incorporate. We propose SoftBinary Coding (SBC), an end-to-end learning paradigm that bypasses these limitations by using a stochastic binary latent space. In the spirit of vector quantization, SBC employs discrete representations and compresses them through a novel fast binary channel-simulation scheme, for which we provide a proof of rate optimality. Experimental gains on information-theoretic sources address NTC's limitations in both theory and practice, establishing discrete binary structures as a viable path toward optimal rate--distortion bounds. Surprisingly, SBC also achieves state-of-the-art performance on vector quantization of i.i.d. sources, outperforming Trellis Coded Quantization on the Gaussian source.