Welcome to the "Big Model" Era: Techniques and Systems to Train and Serve Bigger Models
Hao Zhang · Lianmin Zheng · Zhuohan Li · Ion Stoica

Mon Jul 18 12:30 PM -- 02:50 PM (PDT) @ Hall F
Event URL: https://sites.google.com/view/icml-2022-big-model

In recent years, researchers in ML and systems have been working together to bring big models -- such as GPT-3 with 175B parameters -- into research and production. Scaling up model size has been shown to significantly boost ML performance and even unlock fundamentally new capabilities.

However, experimenting with and adopting big models calls for new techniques and systems to support their training and inference on big data and large clusters. This tutorial identifies research and practical pain points in model-parallel training and serving. In particular, it introduces new algorithmic techniques and system architectures for training and serving popular big models, such as GPT-3, PaLM, and vision transformers. The tutorial also includes a session on how to use the latest open-source system toolsets to support the training and serving of big models. Through this tutorial, we hope to lower the technical barrier to using big models in ML research and bring big models to the masses.
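To make the model-parallel idea concrete, here is a minimal, hedged sketch of tensor (intra-layer) parallelism, one of the techniques used to train models like GPT-3. It is purely illustrative (plain Python lists stand in for devices) and does not represent any specific system covered in the tutorial: a linear layer's weight matrix is split column-wise across two "devices", each computes a partial output, and the shards are concatenated (an all-gather) to recover the full layer output.

```python
def matmul(a, b):
    """Naive dense matmul for small illustrative matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

x = [[1.0, 2.0], [3.0, 4.0]]                       # activations, shape (2, 2)
W = [[1.0, 0.0, 2.0, 1.0], [0.0, 1.0, 1.0, 2.0]]  # full weights, shape (2, 4)

# Column-wise shard: "device 0" holds columns 0-1, "device 1" holds columns 2-3.
W0 = [row[:2] for row in W]
W1 = [row[2:] for row in W]

y0 = matmul(x, W0)   # partial output computed on "device 0"
y1 = matmul(x, W1)   # partial output computed on "device 1"

# Concatenate the output shards along the feature dimension (an all-gather).
y = [r0 + r1 for r0, r1 in zip(y0, y1)]

assert y == matmul(x, W)  # identical to the unsharded computation
```

Because each device stores and multiplies only a slice of the weight matrix, memory per device shrinks as more devices are added, which is what makes layers too large for a single accelerator trainable at all.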

Author Information

Hao Zhang (UC Berkeley)
Lianmin Zheng (UC Berkeley)
Zhuohan Li (UC Berkeley)
Ion Stoica (UC Berkeley)
