
Probabilistically Robust Learning: Balancing Average- and Worst-case Performance
Alex Robey · Luiz F. O. Chamon · George J. Pappas · Hamed Hassani

Wed Jul 20 02:00 PM -- 02:05 PM (PDT) @ Room 301 - 303

Many of the successes of machine learning are based on minimizing an averaged loss function. However, it is well known that this paradigm suffers from robustness issues that hinder its applicability in safety-critical domains. These issues are often addressed by training against worst-case perturbations of the data, a technique known as adversarial training. Although empirically effective, adversarial training can be overly conservative, leading to unfavorable trade-offs between nominal performance and robustness. To address this, we propose a framework called probabilistic robustness that bridges the gap between the accurate yet brittle average case and the robust yet conservative worst case by enforcing robustness to most, rather than all, perturbations. From a theoretical point of view, this framework overcomes the trade-offs between the performance and sample complexity of worst-case and average-case learning. From a practical point of view, we propose a novel algorithm based on risk-aware optimization that effectively balances average- and worst-case performance at a considerably lower computational cost than adversarial training. Our results on MNIST, CIFAR-10, and SVHN illustrate the advantages of this framework across the spectrum from average- to worst-case robustness. Our code is available at: https://github.com/arobey1/advbench.
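The risk-aware idea described in the abstract can be illustrated with a conditional value-at-risk (CVaR) objective over randomly sampled perturbations: averaging only the worst ρ-fraction of perturbed losses interpolates between the average case (ρ = 1) and the worst case (ρ → 0). The sketch below is a hypothetical illustration under these assumptions, not the paper's exact algorithm; the function names, the toy quadratic loss, and all parameter values are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def cvar(losses, rho):
    """Average of the worst rho-fraction of losses (CVaR at level rho)."""
    k = max(1, int(np.ceil(rho * len(losses))))
    worst = np.sort(losses)[-k:]  # the k largest losses
    return worst.mean()

def prob_robust_loss(loss_at, x, eps=0.1, n_samples=100, rho=0.1):
    # Sample random perturbations in an L-infinity ball of radius eps and
    # take the CVaR of the resulting losses: rho=1 recovers the average
    # case, while rho -> 0 approaches the worst case.
    deltas = rng.uniform(-eps, eps, size=(n_samples,) + np.shape(x))
    losses = np.array([loss_at(x + d) for d in deltas])
    return cvar(losses, rho)

# Toy quadratic loss around a point, to show the interpolation.
loss_at = lambda z: float(np.sum(z ** 2))
x = np.array([1.0, -1.0])
avg = prob_robust_loss(loss_at, x, rho=1.0)    # average-case estimate
worst = prob_robust_loss(loss_at, x, rho=0.01) # near worst-case estimate
assert worst >= avg
```

In a training loop, `prob_robust_loss` would replace the standard per-batch loss, with ρ chosen to trade off nominal accuracy against robustness to rare perturbations.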

Author Information

Alex Robey (University of Pennsylvania)
Luiz F. O. Chamon (University of California, Berkeley)

Luiz F. O. Chamon received the B.Sc. and M.Sc. degrees in electrical engineering from the University of São Paulo, São Paulo, Brazil, in 2011 and 2015, respectively, and the Ph.D. degree in electrical and systems engineering from the University of Pennsylvania (Penn), Philadelphia, in 2020. He is currently a postdoctoral researcher at the Simons Institute at the University of California, Berkeley. In 2009, he was an undergraduate exchange student in the Masters in Acoustics program at the École Centrale de Lyon, Lyon, France, and worked as an Assistant Instructor and Consultant on nondestructive testing at INSACAST Formation Continue. From 2010 to 2014, he worked as a Signal Processing and Statistics Consultant on a research project with EMBRAER. In 2018, he was recognized by the IEEE Signal Processing Society for his distinguished work on the editorial board of the IEEE Transactions on Signal Processing. He also received both the best student paper and the best paper awards at IEEE ICASSP 2020. His research interests include optimization, signal processing, machine learning, statistics, and control.

George J. Pappas (University of Pennsylvania)

George J. Pappas is the Joseph Moore Professor and Chair of the Department of Electrical and Systems Engineering at the University of Pennsylvania. He also holds secondary appointments in the Departments of Computer and Information Science, and Mechanical Engineering and Applied Mechanics. He is a member of the GRASP Lab and the PRECISE Center. He has previously served as the Deputy Dean for Research in the School of Engineering and Applied Science. His research focuses on control theory and, in particular, hybrid systems, embedded systems, and hierarchical and distributed control systems, with applications to unmanned aerial vehicles, distributed robotics, green buildings, and biomolecular networks. He is a Fellow of the IEEE and has received various awards, such as the Antonio Ruberti Young Researcher Prize, the George S. Axelby Award, the O. Hugo Schuck Best Paper Award, the National Science Foundation PECASE, and the George H. Heilmeier Faculty Excellence Award.

Hamed Hassani (University of Pennsylvania)

I am an assistant professor in the Department of Electrical and Systems Engineering (as of July 2017). I hold a secondary appointment in the Department of Computer and Information Science. I am also a faculty affiliate of the Warren Center for Network and Data Sciences. Before joining Penn, I was a research fellow at the Simons Institute, UC Berkeley (program: Foundations of Machine Learning). Prior to that, I was a postdoctoral scholar and lecturer in the Institute for Machine Learning at ETH Zürich. I received my Ph.D. degree in Computer and Communication Sciences from EPFL.
