Datasets often have intrinsic symmetries, and dedicated deep-learning models known as equivariant or invariant models have been developed to exploit them. However, when some or all of these symmetries are only approximate, as frequently happens in practice, these models may be suboptimal due to the architectural restrictions imposed on them. We tackle this issue of approximate symmetries in a setup where the symmetries are mixed, i.e., they come from multiple different types, and the degree of approximation varies across these types. Instead of proposing yet another architectural restriction, as most previous approaches do, we present a regularizer-based method for building a model for a dataset with mixed approximate symmetries. The key component of our method is what we call the equivariance regularizer for a given type of symmetry, which measures how equivariant the model is with respect to symmetries of that type. Our method is trained with one such regularizer per symmetry type, and the strengths of the regularizers are tuned automatically during training, leading to the discovery of the approximation level of each candidate symmetry type without explicit supervision. Using synthetic function approximation and motion forecasting tasks, we demonstrate that our method achieves better accuracy than prior approaches while correctly discovering the levels of approximate symmetry.
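To make the idea concrete, below is a minimal sketch of such an equivariance regularizer in PyTorch. The Monte Carlo estimator, the function names, and the rotation example are illustrative assumptions rather than the authors' implementation; in particular, the per-type strengths that the paper tunes automatically during training are only indicated in a comment.

import torch

def equivariance_regularizer(f, x, sample_g, act_in, act_out, n_samples=4):
    # Monte Carlo estimate of E_g || f(g . x) - g . f(x) ||^2 for one symmetry
    # type; it is zero exactly when f is equivariant to the sampled actions.
    y = f(x)
    penalty = x.new_zeros(())
    for _ in range(n_samples):
        g = sample_g()
        penalty = penalty + ((f(act_in(g, x)) - act_out(g, y)) ** 2).mean()
    return penalty / n_samples

# Illustrative symmetry type: planar rotations SO(2) acting on 2-D points.
def sample_rotation():
    theta = 2 * torch.pi * torch.rand(())
    c, s = torch.cos(theta), torch.sin(theta)
    return torch.stack([torch.stack([c, -s]), torch.stack([s, c])])

f = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.ReLU(), torch.nn.Linear(64, 2))
x = torch.randn(32, 2)
reg = equivariance_regularizer(
    f, x,
    sample_g=sample_rotation,
    act_in=lambda g, v: v @ g.T,   # rotate the input points
    act_out=lambda g, v: v @ g.T,  # rotate the outputs (equivariant case)
)
# Training would minimize: task_loss + sum_k lambda_k * reg_k, one regularizer
# per candidate symmetry type, with each strength lambda_k tuned automatically
# during training as described above.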
Author Information
Hyunsu Kim (KAIST)
Hyungi Lee (KAIST)
Hongseok Yang (KAIST)
Juho Lee (KAIST, AITRICS)
More from the Same Authors
- 2023 : Function Space Bayesian Pseudocoreset for Bayesian Neural Networks »
  Balhae Kim · Hyungi Lee · Juho Lee
- 2023 : Early Exiting for Accelerated Inference in Diffusion Models »
  Taehong Moon · Moonseok Choi · EungGu Yun · Jongmin Yoon · Gayoung Lee · Juho Lee
- 2023 : Towards Safe Self-Distillation of Internet-Scale Text-to-Image Diffusion Models »
  Sanghyun Kim · Seohyeon Jung · Balhae Kim · Moonseok Choi · Jinwoo Shin · Juho Lee
- 2023 Poster: Probabilistic Imputation for Time-series Classification with Missing Data »
  SeungHyun Kim · Hyunsu Kim · EungGu Yun · Hwangrae Lee · Jaehun Lee · Juho Lee
- 2023 Poster: Scalable Set Encoding with Universal Mini-Batch Consistency and Unbiased Full Set Gradient Approximation »
  Jeffrey Willette · Seanie Lee · Bruno Andreis · Kenji Kawaguchi · Juho Lee · Sung Ju Hwang
- 2023 Poster: Traversing Between Modes in Function Space for Fast Ensembling »
  EungGu Yun · Hyungi Lee · Giung Nam · Juho Lee
- 2022 Poster: Improving Ensemble Distillation With Weight Averaging and Diversifying Perturbation »
  Giung Nam · Hyungi Lee · Byeongho Heo · Juho Lee
- 2022 Poster: Set Based Stochastic Subsampling »
  Bruno Andreis · Seanie Lee · A. Tuan Nguyen · Juho Lee · Eunho Yang · Sung Ju Hwang
- 2022 Spotlight: Set Based Stochastic Subsampling »
  Bruno Andreis · Seanie Lee · A. Tuan Nguyen · Juho Lee · Eunho Yang · Sung Ju Hwang
- 2022 Spotlight: Improving Ensemble Distillation With Weight Averaging and Diversifying Perturbation »
  Giung Nam · Hyungi Lee · Byeongho Heo · Juho Lee
- 2021 Poster: Adversarial Purification with Score-based Generative Models »
  Jongmin Yoon · Sung Ju Hwang · Juho Lee
- 2021 Spotlight: Adversarial Purification with Score-based Generative Models »
  Jongmin Yoon · Sung Ju Hwang · Juho Lee
- 2021 Poster: Probabilistic Programs with Stochastic Conditioning »
  David Tolpin · Yuan Zhou · Tom Rainforth · Hongseok Yang
- 2021 Spotlight: Probabilistic Programs with Stochastic Conditioning »
  David Tolpin · Yuan Zhou · Tom Rainforth · Hongseok Yang
- 2020 Poster: Divide, Conquer, and Combine: a New Inference Strategy for Probabilistic Programs with Stochastic Support »
  Yuan Zhou · Hongseok Yang · Yee-Whye Teh · Tom Rainforth
- 2018 Poster: On Nesting Monte Carlo Estimators »
  Tom Rainforth · Rob Cornish · Hongseok Yang · Andrew Warrington · Frank Wood
- 2018 Oral: On Nesting Monte Carlo Estimators »
  Tom Rainforth · Rob Cornish · Hongseok Yang · Andrew Warrington · Frank Wood