Workshop
Neural Compression: From Information Theory to Applications
Berivan Isik · Yibo Yang · Daniel Severo · Karen Ullrich · Robert Bamler · Stephan Mandt

Sat Jul 29 12:00 PM -- 08:00 PM (PDT) @ Meeting Room 317 A
Event URL: https://neuralcompression.github.io/workshop23

This workshop aims to address fundamental problems in the young but potentially high-impact field of machine-learning-based methods for data compression and communication. We invite participants to exchange ideas on fundamental issues in neural compression, such as the role of quantization and stochasticity in communication, the characterization and estimation of information measures, and more resource-efficient models and methods. We aim to address these fundamental issues by bringing together researchers from various fields, including machine learning, information theory, statistics, and computer vision.

Author Information

Berivan Isik (Stanford University)
Yibo Yang (University of California, Irvine)
Daniel Severo (University of Toronto & Vector Institute for AI)
Karen Ullrich (Meta AI)
Robert Bamler (University of Tübingen)
Stephan Mandt (University of California, Irvine)

Stephan Mandt is an Assistant Professor of Computer Science at the University of California, Irvine. From 2016 until 2018, he was a Senior Researcher and head of the statistical machine learning group at Disney Research, first in Pittsburgh and later in Los Angeles. He previously held postdoctoral positions at Columbia University and Princeton University. Stephan holds a PhD in Theoretical Physics from the University of Cologne. He is a Fellow of the German National Merit Foundation, a Kavli Fellow of the U.S. National Academy of Sciences, and was a visiting researcher at Google Brain. Stephan serves regularly as an Area Chair for NeurIPS, ICML, AAAI, and ICLR, and is a member of the Editorial Board of JMLR. His research is currently supported by NSF, DARPA, IBM, and Qualcomm.