
Data Augmentation vs. Equivariant Networks: A Theoretical Study of Generalizability on Dynamics Forecasting
Rui Wang · Robin Walters · Rose Yu

Exploiting symmetry in structured data is a powerful way to improve the learning and generalization ability of deep learning models. Data augmentation and equivariant networks are two of the main approaches for enabling neural networks to preserve symmetries. Since real-world data is rarely strictly symmetric, several approximately equivariant networks have also recently been introduced. In this work, we theoretically compare the generalizability of data augmentation techniques, strictly equivariant networks, and approximately equivariant networks. Unlike most prior theoretical works on symmetry, which rely on the i.i.d. assumption, we focus on the generalizability of these three approaches on the task of non-stationary dynamics forecasting.
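To make the contrast between the two symmetry-preserving approaches concrete, here is a toy sketch (not from the paper; all function names are illustrative) using the C4 group of 90-degree rotations: data augmentation enlarges the training set with rotated copies, while group averaging turns an arbitrary map into an exactly equivariant one.

```python
import numpy as np

def rot90(v):
    """Rotate 2D vectors by 90 degrees counter-clockwise."""
    x, y = v[..., 0], v[..., 1]
    return np.stack([-y, x], axis=-1)

def augment(dataset):
    """Data augmentation: enlarge the training set with all C4 rotations
    of each (input, target) pair. Symmetry is only encouraged, not enforced."""
    out = []
    for inp, tgt in dataset:
        for _ in range(4):
            out.append((inp, tgt))
            inp, tgt = rot90(inp), rot90(tgt)
    return out

def symmetrize(f):
    """Group averaging: build f_eq(v) = (1/4) * sum_k R^{-k} f(R^k v),
    which is exactly C4-equivariant for any map f."""
    def f_eq(v):
        total = np.zeros_like(v, dtype=float)
        g = v
        for k in range(4):
            out = f(g)
            for _ in range(k):
                # rot90 applied three times is the inverse rotation R^{-1}
                out = rot90(rot90(rot90(out)))
            total += out
            g = rot90(g)
        return total / 4
    return f_eq
```

The symmetrized map satisfies `f_eq(rot90(v)) == rot90(f_eq(v))` by construction, whereas an augmentation-trained model only approximates this property; that hard-versus-soft constraint is the distinction the paper's generalization analysis makes precise.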

Author Information

Rui Wang (University of California, San Diego)
Robin Walters (Northeastern University)
Rose Yu (University of California, San Diego)

Dr. Rose Yu is an assistant professor in the Department of Computer Science and Engineering at the University of California San Diego. She earned her Ph.D. in Computer Science at USC in 2017 and was subsequently a Postdoctoral Fellow at Caltech. Her research focuses on advancing machine learning techniques for large-scale spatiotemporal data analysis, with applications to sustainability, health, and the physical sciences. A particular emphasis of her research is physics-guided AI, which aims to integrate first principles with data-driven models. Her awards include the NSF CAREER Award, Faculty Research Awards from JP Morgan, Facebook, Google, Amazon, and Adobe, several Best Paper Awards, and the Best Dissertation Award at USC, and she was named one of the 'MIT Rising Stars in EECS'.
