In many applications where collecting data is expensive, for example neuroscience or medical imaging, the sample size is typically small compared to the feature dimension. In this setting, it is challenging to train expressive, non-linear models without overfitting. These datasets call for intelligent regularization that exploits known structure, such as correlations between the features arising from the measurement device. However, existing structured regularizers need specially crafted solvers, which are difficult to apply to complex models. We propose a new regularizer specifically designed to leverage structure in the data in a way that can be applied efficiently to complex models. Our approach relies on feature grouping, using a fast clustering algorithm inside a stochastic gradient descent loop: given a family of feature groupings that capture feature covariations, we randomly select one grouping at each iteration. We show that this approach amounts to enforcing a denoising regularizer on the solution. The method is easy to implement in many model architectures, such as fully connected neural networks, and has linear computational cost. We apply this regularizer to a real-world fMRI dataset and the Olivetti Faces dataset. Experiments on both datasets demonstrate that the proposed approach produces models that generalize better than those trained with conventional regularizers, and also improves convergence speed.
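To make the mechanism concrete, below is a minimal, hypothetical sketch of the idea in Python (PyTorch + scikit-learn). It is not the authors' implementation: the helper names (`make_groupings`, `group_average`) and all hyperparameters are illustrative, `KMeans` on bootstrap resamples stands in for the fast clustering algorithm used to build the family of groupings, and replacing each feature by its within-group mean is one way to realize the denoising interpretation described in the abstract.

```python
# Hypothetical sketch of stochastic feature-grouping regularization.
# Names and hyperparameters are illustrative, not from the paper's code.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def make_groupings(X, n_groups, n_groupings, rng):
    """Cluster the feature columns of X several times on bootstrap
    resamples to build a family of groupings capturing feature
    covariations (KMeans is a stand-in for any fast clustering)."""
    groupings = []
    for _ in range(n_groupings):
        rows = rng.choice(len(X), size=len(X), replace=True)  # bootstrap
        km = KMeans(n_clusters=n_groups, n_init=3)
        groupings.append(km.fit(X[rows].T).labels_)  # cluster features
    return groupings

def group_average(x, labels):
    """Replace each feature with its within-group mean: a projection
    that acts as structured denoising on the input batch."""
    out = torch.empty_like(x)
    for g in np.unique(labels):
        idx = torch.as_tensor(np.flatnonzero(labels == g))
        out[:, idx] = x[:, idx].mean(dim=1, keepdim=True)
    return out

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500)).astype(np.float32)  # toy high-dim data
y = rng.integers(0, 2, size=100)
groupings = make_groupings(X, n_groups=50, n_groupings=20, rng=rng)

model = nn.Sequential(nn.Linear(500, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

Xt, yt = torch.from_numpy(X), torch.from_numpy(y)
for step in range(200):
    batch = rng.choice(len(Xt), size=32, replace=False)
    labels = groupings[rng.integers(len(groupings))]  # random grouping per step
    xb = group_average(Xt[batch], labels)             # stochastic regularization
    loss = loss_fn(model(xb), yt[batch])
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because a different grouping is sampled at every iteration, the model never sees the same deterministic projection twice; under the assumptions above, this is what gives the method its stochastic, dropout-like regularization effect while respecting the correlation structure of the features.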
Author Information
Sergul Aydore (Stevens Institute of Technology)
Sergul Aydore is an applied scientist at Amazon Web Services (AWS). Prior to AWS, Sergul was an Assistant Professor in the Department of Electrical and Computer Engineering at Stevens Institute of Technology. She received her PhD from the Signal and Image Processing Institute at the University of Southern California in 2014. Her PhD work was on developing robust connectivity measures for neuroimaging data. She was the recipient of the Viterbi School of Engineering Doctoral Fellowship and was recognized as a 2014 USC Ming Hsieh Institute PhD Scholar. Sergul has published work on advancing generalization in machine learning models at top-tier conferences such as ICML and NeurIPS. She also served as an area chair for WiML at NeurIPS 2019. Her research at Stevens was supported by AWS ML Research Awards.
Bertrand Thirion (Inria)
Gael Varoquaux (Inria)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Feature Grouping as a Stochastic Regularizer for High-Dimensional Structured Data
  Wed. Jun 12th, 01:30 -- 04:00 AM, Room: Pacific Ballroom #121
More from the Same Authors
- 2022 Poster: Neural Language Models are not Born Equal to Fit Brain Data, but Training Helps
  Alexandre Pasquiou · Yair Lakretz · John Hale · Bertrand Thirion · Christophe Pallier
- 2022 Spotlight: Neural Language Models are not Born Equal to Fit Brain Data, but Training Helps
  Alexandre Pasquiou · Yair Lakretz · John Hale · Bertrand Thirion · Christophe Pallier
- 2021 Poster: Differentially Private Query Release Through Adaptive Projection
  Sergul Aydore · William Brown · Michael Kearns · Krishnaram Kenthapadi · Luca Melis · Aaron Roth · Ankit Siva
- 2021 Oral: Differentially Private Query Release Through Adaptive Projection
  Sergul Aydore · William Brown · Michael Kearns · Krishnaram Kenthapadi · Luca Melis · Aaron Roth · Ankit Siva
- 2020 Poster: Aggregation of Multiple Knockoffs
  Tuan-Binh Nguyen · Jerome-Alexis Chevalier · Bertrand Thirion · Sylvain Arlot