

Neural Compression: From Information Theory to Applications

Berivan Isik · Yibo Yang · Daniel Severo · Karen Ullrich · Robert Bamler · Stephan Mandt

Meeting Room 317 A

This workshop aims to address fundamental problems in the young but potentially highly impactful field of machine-learning-based methods for data compression and communication. We invite participants to exchange ideas on fundamental issues in neural compression, such as the role of quantization and stochasticity in communication, the characterization and estimation of information measures, and more resource-efficient models and methods. We aim to address these fundamental issues by bringing together researchers from various fields, including machine learning, information theory, statistics, and computer vision.
