Typically, images in an unordered dataset are compressed individually and sequentially, in random order. Unfortunately, general set compression methods that improve over this default sequential treatment yield only small rate gains for high-dimensional objects such as images. We propose an approach for compressing image datasets that applies an image-to-image conditional generative model to a reordered dataset. Our approach is inspired by Associative Compression Networks (Graves et al., 2018). Although this variant of the variational auto-encoder was developed primarily for representation learning, its authors report substantial gains in the lossless compression of latent variables. We apply the core idea of that work, adapting the generative prior to a previously seen neighbor image, to a commonly used neural compression model: the mean-scale hyperprior model (MSHP) (Ballé et al., 2018; Minnen et al., 2018). The architectural changes we propose are, however, also applicable to other methods such as ELIC (He et al., 2022). We train our model on subsets of an ordered version of ImageNet and report rate-distortion curves on the same dataset. Unfortunately, we observe gains only in latent space, so we speculate as to why the approach does not lead to more significant improvements.
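The core architectural change described above, conditioning the entropy model of a mean-scale hyperprior on a previously seen neighbor image, can be illustrated with a minimal sketch. This is not the authors' implementation: the module names, layer configurations, and the simple concatenation-based fusion of hyper-latent and neighbor features are assumptions made for illustration only.

```python
# Minimal illustrative sketch (assumed architecture, not the paper's code):
# a mean-scale hyperprior whose entropy parameters for the image latent y
# are predicted from the hyper-latent z AND features of a neighbor image.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalMeanScaleHyperprior(nn.Module):
    def __init__(self, channels: int = 192):
        super().__init__()
        # Simplified analysis/synthesis transforms for the image latent y.
        self.g_a = nn.Sequential(
            nn.Conv2d(3, channels, 5, stride=2, padding=2), nn.GELU(),
            nn.Conv2d(channels, channels, 5, stride=2, padding=2),
        )
        self.g_s = nn.Sequential(
            nn.ConvTranspose2d(channels, channels, 5, stride=2, padding=2, output_padding=1), nn.GELU(),
            nn.ConvTranspose2d(channels, 3, 5, stride=2, padding=2, output_padding=1),
        )
        # Hyper-analysis for z and a feature extractor for the neighbor image.
        self.h_a = nn.Conv2d(channels, channels, 5, stride=4, padding=2)
        self.neighbor_enc = nn.Sequential(
            nn.Conv2d(3, channels, 5, stride=4, padding=2), nn.GELU(),
            nn.Conv2d(channels, channels, 5, stride=4, padding=2),
        )
        # Hyper-synthesis: predicts (mean, scale) of y from z and the neighbor.
        self.h_s = nn.Sequential(
            nn.ConvTranspose2d(2 * channels, channels, 5, stride=4, padding=2, output_padding=3), nn.GELU(),
            nn.Conv2d(channels, 2 * channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor, x_neighbor: torch.Tensor):
        y = self.g_a(x)                    # image latent
        z = self.h_a(y)                    # hyper-latent
        c = self.neighbor_enc(x_neighbor)  # neighbor conditioning features
        # Fuse hyper-latent and neighbor features, then predict entropy parameters.
        params = self.h_s(torch.cat([z, c], dim=1))
        mean, scale = params.chunk(2, dim=1)
        # Hard rounding for illustration; training typically uses additive
        # uniform noise or a straight-through estimator instead.
        y_hat = torch.round(y - mean) + mean
        x_hat = self.g_s(y_hat)
        return x_hat, mean, F.softplus(scale)


if __name__ == "__main__":
    x = torch.randn(1, 3, 64, 64)       # current image
    x_prev = torch.randn(1, 3, 64, 64)  # previously transmitted neighbor
    model = ConditionalMeanScaleHyperprior()
    x_hat, mean, scale = model(x, x_prev)
    print(x_hat.shape, mean.shape, scale.shape)
```

Since the prior is adapted to a previously seen image, the decoder can only use a neighbor that has already been decoded, so the dataset ordering determines which image conditions which.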
Author Information
Winnie Xu (University of Toronto)
Matthew Muckley (Meta AI)
Yann Dubois (Stanford University)
Karen Ullrich (Meta AI)
More from the Same Authors
- 2022: Learning to Discretize for Continuous-time Sequence Compression
  Ricky T. Q. Chen · Maximilian Nickel · Matthew Le · Matthew Muckley · Karen Ullrich
- 2023 Poster: Learning Instance-Specific Augmentations by Capturing Local Invariances
  Ning Miao · Tom Rainforth · Emile Mathieu · Yann Dubois · Yee-Whye Teh · Adam Foster · Hyunjik Kim
- 2023 Oral: Evaluating Self-Supervised Learning via Risk Decomposition
  Yann Dubois · Tatsunori Hashimoto · Percy Liang
- 2023 Poster: Evaluating Self-Supervised Learning via Risk Decomposition
  Yann Dubois · Tatsunori Hashimoto · Percy Liang
- 2023 Poster: Deep Latent State Space Models for Time-Series Generation
  Linqi Zhou · Michael Poli · Winnie Xu · Stefano Massaroli · Stefano Ermon
- 2023 Poster: Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models
  Matthew Muckley · Alaaeldin El-Nouby · Karen Ullrich · Herve Jegou · Jakob Verbeek