Workshop
Fri Jul 17 05:00 AM -- 01:35 PM (PDT)
Challenges in Deploying and Monitoring Machine Learning Systems
Alessandra Tosi · Nathan Korda · Neil Lawrence

Until recently, Machine Learning was applied in industry mostly by consulting academics, by data scientists within larger companies, and by dedicated Machine Learning research labs at a few of the world's most innovative tech companies. Over the last few years, we have seen the dramatic rise of companies dedicated to providing Machine Learning software-as-a-service tools, with the aim of democratizing access to the benefits of Machine Learning. All of these efforts have revealed major hurdles to ensuring the continual delivery of good performance from deployed Machine Learning systems. These hurdles range from challenges in MLOps, to fundamental problems with deploying certain algorithms, to the legal and ethical issues raised by letting algorithms make decisions for a business.

This workshop will invite papers on the challenges of deploying and monitoring ML systems. It will encourage submissions on: MLOps for deployed ML systems (testing, debugging, and monitoring ML systems and models, and deploying ML at scale); the ethics of deploying ML systems (ensuring fairness, trust, and transparency, and providing privacy and security); useful tools and programming languages for deploying ML systems; the specific challenges of deploying reinforcement learning, performing continual learning, and providing continual delivery in ML systems; and, finally, data challenges for deployed ML systems.

Opening remarks (Talk)
Alessandra Tosi, Nathan Korda
Deploying Machine Learning Models in a Developing Country (Invited talk)
Ernest Mwebaze
System-wide Monitoring Architectures with Explanations (Invited talk)
Leilani Gilpin
First Break (Break)
Bridging the gap between research and production in machine learning (Invited talk)
Chip Huyen
Monitoring and explainability of models in production (Contributed talk)
Janis Klaise
Gradient-Based Monitoring of Learning Machines (Contributed talk)
Lang Liu
Not Your Grandfather's Test Set: Reducing Labeling Effort for Testing (Contributed talk)
Begum Taskazan
Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models (Contributed talk)
Lasse F. Wolff Anthony
Serverless inferencing on Kubernetes (Contributed talk)
Clive Cox
Do You Sign Your Model? (Contributed talk)
Omid Aramoon
PareCO: Pareto-aware Channel Optimization for Slimmable Neural Networks (Contributed talk)
Ting-wu Chin
Technology Readiness Levels for Machine Learning Systems (Contributed talk)
Alexander Lavin
Poster session
Janis Klaise, Lang Liu, Begum Taskazan, Lasse F. Wolff Anthony, Clive Cox, Omid Aramoon, Ting-wu Chin, Alexander Lavin
Second Break (Break)
Open Problems Panel (Panel)
Alessandra Tosi, Nathan Korda, Yuzhui Liu, Zhenwen Dai, Alexander Lavin, Erick Galinkin, Camylle Lanteigne
Third break (Break)
Conservative Exploration in Bandits and Reinforcement Learning (Invited talk)
Mohammad Ghavamzadeh
Successful Data Science in Production Systems: It’s All About Assumptions (Invited talk)
Nevena Lalic
Panel discussion (Panel)
Neil Lawrence, Mohammad Ghavamzadeh, Leilani Gilpin, Chip Huyen, Ernest Mwebaze, Nevena Lalic