

Poster in Workshop: Neural Compression: From Information Theory to Applications

Neural Image Compression: Generalization, Robustness, and Spectral Biases

Kelsey Lieberman · James Diffenderfer · Charles Godfrey · Bhavya Kailkhura


Abstract:

Recent advances in neural image compression (NIC) have produced models that are beginning to outperform traditional codecs. While this has led to growing excitement about using NIC in real-world applications, the successful adoption of any machine learning system in the wild requires it to generalize (and be robust) to unseen distribution shifts at deployment. Unfortunately, current research lacks comprehensive datasets and informative tools to evaluate and understand NIC performance in real-world settings. To bridge this crucial gap, we provide a comprehensive benchmark suite to evaluate the out-of-distribution (OOD) performance of image compression methods and propose spectrally inspired inspection tools to gain deeper insight into the errors introduced by image compression methods as well as their OOD performance. We then carry out a detailed performance comparison of a classical codec with NIC variants, revealing intriguing findings that challenge our current understanding of NIC.
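
As a rough illustration of what a spectral inspection of compression error might look like, the sketch below computes a radially averaged power spectrum of the difference between an original image and its reconstruction. This is a minimal sketch under assumptions: the function name, binning scheme, and the synthetic stand-in reconstruction are illustrative and not drawn from the paper's actual tooling.

```python
import numpy as np

def radial_power_spectrum(error: np.ndarray, num_bins: int = 64) -> np.ndarray:
    """Radially averaged power spectrum of a 2D (H x W) error image.

    Illustrative helper, not the authors' implementation.
    """
    # 2D FFT of the error, shifted so the zero frequency sits at the center.
    f = np.fft.fftshift(np.fft.fft2(error))
    power = np.abs(f) ** 2

    # Distance of each frequency bin from the center, i.e. its radial frequency.
    h, w = error.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)

    # Average the power within equally spaced radial bins.
    bins = np.linspace(0.0, r.max(), num_bins + 1)
    idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, num_bins - 1)
    total = np.bincount(idx, weights=power.ravel(), minlength=num_bins)
    counts = np.bincount(idx, minlength=num_bins)
    return total / np.maximum(counts, 1)

# Usage example with synthetic data standing in for a real decode.
rng = np.random.default_rng(0)
original = rng.random((256, 256))
reconstruction = original + 0.01 * rng.standard_normal((256, 256))
spectrum = radial_power_spectrum(original - reconstruction)
```

Comparing such spectra across codecs (or across in-distribution and OOD inputs) is one way to see which spatial-frequency bands a compression method distorts most.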
