3rd Workshop on High-dimensional Learning Dynamics (HiLD)
Atish Agarwala · Jason Lee · Bruno Loureiro · Aukosh Jagannath · Inbar Seroussi
Modern machine learning applications face the challenge of extracting insights from high-dimensional datasets. The 3rd High-dimensional Learning Dynamics (HiLD) Workshop focuses on predicting and analyzing the behavior of learning algorithms in regimes where both the number of samples and parameters are large. This workshop aims to advance research and foster collaboration in several key areas:
- Developing tractable models and dynamical frameworks to explain phenomena observed in deep neural networks (DNNs) and foundation models;
- Establishing mathematical frameworks for neural scaling laws as network width and depth approach infinity;
- Identifying and characterizing relevant observable quantities in high-dimensional limits;
- Understanding the provable effects of optimization algorithms, hyperparameters, and neural architectures on training and test dynamics.
The HiLD Workshop will unite experts from random matrix theory, optimization, high-dimensional statistics and probability, and statistical physics to share diverse perspectives on these challenges. By bringing together machine learning theorists and practitioners with researchers from these adjacent fields, we aim to create new collaborations between communities that often do not interact. Through talks, poster sessions, and panel discussions, the workshop will explore the fundamental dynamics of learning algorithms in high-dimensional settings. This year's theme is "Navigating Complexity: Feature Learning Dynamics at Scale."