Poster
The Dynamics of Learning: A Random Matrix Approach
Zhenyu Liao · Romain Couillet

Wed Jul 11 09:15 AM -- 12:00 PM (PDT) @ Hall B #189

Understanding the learning dynamics of neural networks is key both to improving optimization algorithms and to explaining theoretically why deep neural nets work so well today. In this paper, we introduce a random matrix-based framework to analyze the learning dynamics of a single-layer linear network trained by gradient descent on a binary classification problem, for data of simultaneously large dimension and sample size. Our results provide rich insights into common questions in neural networks, such as overfitting, early stopping, and the initialization of training, thereby opening the door to future studies of the more elaborate structures and models appearing in today's neural networks.
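
As a rough illustration of the setting the abstract describes, the following minimal numpy sketch (not the authors' code) trains a single-layer linear classifier by gradient descent on a two-class Gaussian mixture, with dimension p and sample size n of the same order; the mixture mean mu, the step size, the squared loss, and all numerical values are illustrative assumptions.

    # Minimal sketch (assumptions throughout): single-layer linear model,
    # two-class Gaussian mixture with means +/- mu, squared loss, plain
    # gradient descent. Not the authors' code.
    import numpy as np

    rng = np.random.default_rng(0)

    p, n = 256, 512              # dimension and sample size, both large
    mu = np.zeros(p)
    mu[0] = 2.0                  # class means at +/- mu (arbitrary choice)

    def sample(n_samples):
        """Labels y in {-1, +1}; data x = y * mu + standard Gaussian noise."""
        y = rng.choice([-1.0, 1.0], size=n_samples)
        X = y[:, None] * mu[None, :] + rng.standard_normal((n_samples, p))
        return X, y

    X_train, y_train = sample(n)
    X_test, y_test = sample(n)

    w = rng.standard_normal(p) / np.sqrt(p)   # small random initialization
    lr = 0.01                                  # illustrative step size

    for t in range(2001):
        # Gradient of the squared loss (1/2n) * ||X w - y||^2
        grad = X_train.T @ (X_train @ w - y_train) / n
        w -= lr * grad
        if t % 400 == 0:
            train_err = np.mean(np.sign(X_train @ w) != y_train)
            test_err = np.mean(np.sign(X_test @ w) != y_test)
            print(f"step {t:5d}  train error {train_err:.3f}  test error {test_err:.3f}")

In such a run, the training error typically keeps decreasing while the test error bottoms out at an intermediate step, illustrating the overfitting and early-stopping phenomena the paper characterizes analytically in the large-p, large-n regime.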

Author Information

Zhenyu Liao (L2S, CentraleSupélec)

Zhenyu Liao received his Ph.D. in applied mathematics and informatics in 2019 from the University of Paris-Saclay, France. In 2020 he was a postdoctoral researcher in the Department of Statistics at the University of California, Berkeley. He is currently an assistant professor at Huazhong University of Science and Technology (HUST), China. His research interests lie broadly in machine learning, signal processing, random matrix theory, and high-dimensional statistics. He has published more than 20 papers at top-tier machine learning conferences such as ICML, NeurIPS, ICLR, COLT, and AISTATS, and he co-authored the book “Random Matrix Methods for Machine Learning.”

Romain Couillet (CentraleSupélec)
