Neural compression offers a domain-agnostic approach to creating codecs for lossy or lossless compression via deep generative models. For sequence compression, however, most deep sequence models have costs that scale with the sequence length rather than the sequence complexity. In this work, we instead treat data sequences as observations from an underlying continuous-time process and learn how to efficiently discretize while retaining information about the full sequence. By decoupling sequential information from its temporal discretization, our approach allows for greater compression rates and lower computational cost. Moreover, the continuous-time formulation naturally allows us to decode at arbitrary time intervals and is amenable to randomly missing data, an important property for streaming applications. We empirically verify our approach on multiple domains involving compression of video and motion capture sequences, showing that it automatically achieves significant reductions in bit rate.
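The abstract's key property — decoding at arbitrary time intervals from a sparse latent discretization — can be illustrated with a minimal sketch. This is not the paper's method; it assumes a hypothetical decoder that stores latent states only at a few keyframe times and reconstructs intermediate states by linear interpolation, which is the simplest continuous-time readout one could use:

```python
import numpy as np

def interpolate_latents(keyframe_times, keyframe_latents, query_times):
    """Decode latent states at arbitrary query times by linearly
    interpolating between sparsely stored keyframe latents.

    keyframe_times:   (K,) sorted timestamps of the stored keyframes
    keyframe_latents: (K, D) latent vectors at those timestamps
    query_times:      (Q,) arbitrary (possibly irregular) decode times
    returns:          (Q, D) interpolated latent vectors
    """
    keyframe_latents = np.asarray(keyframe_latents, dtype=float)
    # Interpolate each latent dimension independently along the time axis.
    return np.stack(
        [np.interp(query_times, keyframe_times, keyframe_latents[:, d])
         for d in range(keyframe_latents.shape[1])],
        axis=-1,
    )

# Store only three keyframes for a one-second clip ...
times = np.array([0.0, 0.5, 1.0])
latents = np.array([[0.0, 1.0], [1.0, 0.0], [0.0, -1.0]])
# ... then decode at a finer, irregular grid, as a streaming
# receiver with randomly missing frames might request.
queries = np.array([0.1, 0.25, 0.75])
decoded = interpolate_latents(times, latents, queries)  # shape (3, 2)
```

Because the compressed representation is indexed by continuous time rather than frame number, the decoder can be queried at any sampling rate without re-encoding; the paper's actual model replaces the linear interpolant with a learned continuous-time process.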
Author Information
Ricky T. Q. Chen (Facebook AI Research)
Maximilian Nickel (Meta AI)
Matthew Le (Facebook AI Research)
Matthew Muckley (Facebook AI Research)
Karen Ullrich (Facebook AI Research)
More from the Same Authors
- 2022 : P24: Unifying Generative Models with GFlowNets
  Dinghuai Zhang · Ricky T. Q. Chen
- 2023 : Revisiting Associative Compression: I Can't Believe It's Not Better
  Winnie Xu · Matthew Muckley · Yann Dubois · Karen Ullrich
- 2023 Poster: Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models
  Matthew Muckley · Alaaeldin El-Nouby · Karen Ullrich · Herve Jegou · Jakob Verbeek
- 2022 Poster: Matching Normalizing Flows and Probability Paths on Manifolds
  Heli Ben-Hamu · Samuel Cohen · Joey Bose · Brandon Amos · Maximilian Nickel · Aditya Grover · Ricky T. Q. Chen · Yaron Lipman
- 2022 Spotlight: Matching Normalizing Flows and Probability Paths on Manifolds
  Heli Ben-Hamu · Samuel Cohen · Joey Bose · Brandon Amos · Maximilian Nickel · Aditya Grover · Ricky T. Q. Chen · Yaron Lipman
- 2021 : Invited Talk 6 (Maximilian Nickel): Modeling Spatio-Temporal Events via Normalizing Flows
  Maximilian Nickel
- 2021 Poster: CURI: A Benchmark for Productive Concept Learning Under Uncertainty
  Shanmukha Ramakrishna Vedantam · Arthur Szlam · Maximilian Nickel · Ari Morcos · Brenden Lake
- 2021 Spotlight: CURI: A Benchmark for Productive Concept Learning Under Uncertainty
  Shanmukha Ramakrishna Vedantam · Arthur Szlam · Maximilian Nickel · Ari Morcos · Brenden Lake
- 2018 Poster: Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry
  Maximilian Nickel · Douwe Kiela
- 2018 Oral: Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry
  Maximilian Nickel · Douwe Kiela