

Invited Talks

July 19, 2022, 6 a.m.

Given a machine learning model, what is the class of functions that can be approximated efficiently by this particular model, in the sense that the convergence rates for the approximation, estimation, and optimization errors do not deteriorate as the dimensionality goes up? We address this question for three classes of machine learning models: the random feature model, two-layer neural networks, and residual neural networks. Along the way, we will also summarize the current status of the theoretical foundations of deep learning and discuss some of the key open questions.
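To make the first of the three models concrete, here is a minimal, hypothetical sketch (not from the talk) of a random feature model: the inner-layer weights are drawn at random and frozen, and only the output weights are fit, here by ridge regression on a toy regression problem. All names and parameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: n = 200 samples in d = 5 dimensions.
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(X @ np.ones(5))          # smooth target function (illustrative)

# Random feature map: phi(x) = relu(W^T x + b) with W, b drawn once and frozen.
m = 500                             # number of random features (arbitrary choice)
W = rng.normal(size=(5, m))
b = rng.normal(size=m)
Phi = np.maximum(X @ W + b, 0.0)    # (n, m) feature matrix

# Only the output weights a are trained, via ridge regression.
lam = 1e-3                          # ridge penalty (hypothetical choice)
a = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)

pred = Phi @ a
train_mse = np.mean((pred - y) ** 2)
```

Because the inner weights are fixed, training reduces to a linear (convex) problem in the output weights, which is what makes this model class amenable to the kind of approximation and optimization analysis the abstract describes.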


Weinan E

Weinan E is a professor at the Center for Machine Learning Research (CMLR) and the School of Mathematical Sciences at Peking University. He is also a professor in the Department of Mathematics and the Program in Applied and Computational Mathematics at Princeton University. His main research interests are numerical algorithms, machine learning, and multi-scale modeling, with applications to chemistry, materials science, and fluid mechanics.

Weinan E was awarded the ICIAM Collatz Prize in 2003, the SIAM Kleinman Prize in 2009, the SIAM von Kármán Prize in 2014, the SIAM-ETH Peter Henrici Prize in 2019, and the ACM Gordon Bell Prize in 2020. He is a member of the Chinese Academy of Sciences and a fellow of SIAM, AMS, and IOP. Weinan E is an invited plenary speaker at ICM 2022; he was also an invited speaker at ICM 2002, ICIAM 2007, and the AMS National Meeting in 2003. In addition, he has been an invited speaker at APS, ACS, and AIChE annual meetings, the World Congress of Computational Mechanics, and the American Conference on Theoretical Chemistry.

July 20, 2022, 6 a.m.

The first-generation models for drug discovery and clinical applications were mostly direct modifications of algorithms developed for NLP, computer vision, and other well-established application areas. However, deploying these models revealed a significant mismatch between their basic assumptions and the needs of these new life-sciences applications. Examples include challenging generalization scenarios, unknown biases in the collected data, and the inability of domain experts to validate model predictions. In my talk, I will illustrate some of these problems and introduce our initial solutions to them.


Regina Barzilay

Regina Barzilay is an Israeli-American computer scientist. She is a professor at the Massachusetts Institute of Technology and a faculty lead for artificial intelligence at the MIT Jameel Clinic. Her research interests are in natural language processing and applications of deep learning to chemistry and oncology.

July 21, 2022, 6 a.m.

Abstract coming soon...


Aviv Regev

Aviv Regev is a computational and systems biologist and Executive Vice President and Head of Genentech Research and Early Development at Genentech/Roche. She is a core member at the Broad Institute of MIT and Harvard and a professor in the Department of Biology at the Massachusetts Institute of Technology.

In 2020, Regev became the Head and Executive Vice President of Genentech Research and Early Development, based in South San Francisco, and a member of the extended Corporate Executive Committee of Roche. Previously, she was a Core Institute Member (now on leave), Chair of the Faculty, Founding Director of the Klarman Cell Observatory, and co-Director of the Cell Circuits Program at the Broad Institute of MIT and Harvard. She was also a professor in the Department of Biology at the Massachusetts Institute of Technology (now on leave), as well as an Investigator at the Howard Hughes Medical Institute. Regev's research includes work on gene expression (with Eran Segal and David Botstein) and the use of the π-calculus to represent biochemical processes. Regev's team has been a leading pioneer of experimental and computational methods for single-cell genomics.

July 20, 2022, 12:15 p.m.

There is a fast-growing literature in econometrics on estimating causal effects in settings with panel or longitudinal data, building on the recent difference-in-differences and synthetic control literatures. This work is driven by an empirical literature with many applications, ranging from settings with few cross-sectional units and few or many time periods to settings with many cross-sectional units, and from a single treated unit to many. I will review some of this recent literature, including some of these applications, focusing in particular on the synthetic difference-in-differences estimator and its relation to the matrix completion literature. I will also discuss implications for randomized experiments and some of the remaining challenges in the literature.
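For readers unfamiliar with the baseline that this literature extends, here is an illustrative sketch (with made-up numbers, not from the talk) of the classic two-group, two-period difference-in-differences estimator: the change over time in the treated group minus the change over time in the control group.

```python
# Toy panel: two groups (control, treated) observed pre- and post-treatment.
# Numbers are hypothetical; a treatment effect of 2.0 is built in on top of
# a common time trend of +2.0 shared by both groups.
y_treat_pre, y_treat_post = 5.0, 9.0
y_ctrl_pre,  y_ctrl_post  = 4.0, 6.0

# Difference-in-differences: subtracting the control group's change
# removes the common time trend under the parallel-trends assumption.
did = (y_treat_post - y_treat_pre) - (y_ctrl_post - y_ctrl_pre)
# did == 2.0, the built-in treatment effect
```

Synthetic control and synthetic difference-in-differences methods generalize this idea by constructing a weighted combination of control units (and time periods) to serve as the comparison, rather than a single unweighted control group.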


Guido Imbens

Guido Imbens is The Applied Econometrics Professor at the Stanford Graduate School of Business and Professor of Economics in the Economics Department at Stanford University. Currently he is also the Amman Mineral Faculty Fellow at the GSB. He held tenured positions at UCLA, UC Berkeley, and Harvard University before joining Stanford in 2012. Imbens specializes in econometrics, in particular methods for drawing causal inferences from experimental and observational data. He has published extensively in the leading economics and statistics journals. Together with Donald Rubin he published the book Causal Inference for Statistics, Social, and Biomedical Sciences. Guido Imbens is a fellow of the Econometric Society, the Royal Holland Society of Sciences and Humanities, the Royal Netherlands Academy of Sciences, the American Academy of Arts and Sciences, and the American Statistical Association. He holds honorary doctorates from the University of St. Gallen and Brown University. In 2017 he received the Horace Mann Medal at Brown University. In 2021 he shared the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel with David Card and Joshua Angrist, "for their methodological contributions to the analysis of causal relationships." Currently Imbens is Editor of Econometrica.