ICML 2019


Workshop

Coding Theory For Large-scale Machine Learning

Viveck Cadambe · Pulkit Grover · Dimitris Papailiopoulos · Gauri Joshi

Coding theory involves the art and science of adding redundancy to data to ensure that a desirable output is obtained despite deviations from ideal behavior in the system components that interact with the data. Through a rich, mathematically elegant set of techniques, coding theory has come to significantly influence the design of modern data communication, compression, and storage systems. The last few years have seen rapidly growing interest in coding-theoretic approaches to developing efficient machine learning algorithms for robust, large-scale, distributed computational pipelines.
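
To make the flavor of these techniques concrete, here is a minimal sketch (our illustration, not part of the workshop materials) of gradient coding for straggler mitigation, one of the topics listed below: three workers each compute two of three gradient partitions and send a single coded combination, so that the full gradient can be recovered from any two workers, tolerating one straggler.

# A minimal gradient coding sketch: each of 3 workers computes two of
# three gradient partitions and sends ONE coded message; the sum
# g1 + g2 + g3 is recoverable from any two of the three messages.
import numpy as np

rng = np.random.default_rng(0)
g1, g2, g3 = (rng.standard_normal(4) for _ in range(3))  # gradient partitions
full_gradient = g1 + g2 + g3  # what the parameter server needs

# Coded messages: each worker touches only two partitions.
w1 = 0.5 * g1 + g2
w2 = g2 - g3
w3 = 0.5 * g1 + g3

# Decoding rule for each possible pair of surviving (non-straggler) workers.
decoders = {
    (1, 2): lambda: 2 * w1 - w2,
    (1, 3): lambda: w1 + w3,
    (2, 3): lambda: w2 + 2 * w3,
}

for survivors, decode in decoders.items():
    assert np.allclose(decode(), full_gradient)
    print(f"workers {survivors} suffice to recover the full gradient")

The redundancy (each partition is computed by two workers) is what buys tolerance to a slow or failed worker, at the cost of extra computation; trading off these quantities is exactly the kind of question in the workshop's scope.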

The CodML workshop brings together researchers developing coding techniques for machine learning with researchers building systems implementations for large-scale computing, featuring cutting-edge presentations from both sides. The goals are to learn about non-idealities in system components, to survey approaches that achieve reliable and robust learning despite these non-idealities, and to identify problems of future interest.

The workshop is co-located with ICML 2019, and will be held in Long Beach, California, USA on June 14th or 15th, 2019.

Please see the workshop website for more details.

Call for Posters

Scope of the Workshop

In this workshop we solicit research papers focused on the application of coding-theoretic and information-theoretic techniques to distributed machine learning. More broadly, we seek papers that address the problem of making machine learning more scalable, efficient, and robust. Both theoretical and experimental contributions are welcome. We invite authors to submit papers on topics including but not limited to:

  • Asynchronous Distributed Training Methods
  • Communication-Efficient Training
  • Model Compression and Quantization
  • Gradient Coding, Compression and Quantization
  • Erasure Coding Techniques for Straggler Mitigation
  • Data Compression in Large-scale Machine Learning
  • Erasure Coding Techniques for ML Hardware Acceleration
  • Fast, Efficient and Scalable Inference
  • Secure and Private Machine Learning
  • Data Storage/Access for Machine Learning Jobs
  • Performance Evaluation of Coding Techniques

Submission Format and Instructions

Authors should prepare extended abstracts in the ICML paper format and submit them via CMT. Submitted papers may not exceed three (3) single-spaced, double-column pages, excluding references. All results, proofs, figures, and tables must fit within the three pages. Submitted manuscripts should include author names and affiliations, and an abstract of at most 250 words. Authors may include a link to an extended version of the paper containing supplementary material (proofs, experimental details, etc.), but reviewers are not required to read the extended version.

Dual Submission Policy

Accepted submissions will be considered non-archival and can be submitted elsewhere without modification, as long as the other conference allows it. Moreover, submissions to CodML based on work recently accepted to other venues are also acceptable (though authors should explicitly make note of this in their submissions).

Key Dates

Paper Submission: May 3rd, 2019, 11:59 PM Anywhere on Earth (AoE)

Decision Notification: May 12th, 2019

Workshop Date: June 14th or 15th, 2019
