

Poster in Workshop: The Second Workshop on Spurious Correlations, Invariance and Stability

Group Robustness via Adaptive Class-Specific Scaling

Seonguk Seo · Bohyung Han


Abstract:

Group distributionally robust optimization, which aims to improve robust accuracies such as worst-group or unbiased accuracy, is one of the mainstream approaches for mitigating spurious correlations and handling dataset bias. Existing methods appear to improve robust accuracy, but these gains largely come from trading off average accuracy. To address this challenge, we first propose a simple class-specific scaling strategy that controls the trade-off between robust and average accuracy flexibly and efficiently; it is directly applicable to existing debiasing algorithms without additional training, and it reveals that a naive ERM baseline equipped only with class-specific scaling matches or even outperforms recent debiasing approaches. We then use this technique to evaluate existing algorithms comprehensively by introducing a novel unified metric that summarizes the trade-off between the two accuracies as a single scalar value. We further develop an instance-wise adaptive scaling technique that overcomes the trade-off and improves performance in terms of both accuracies. We conduct experiments on datasets in the computer vision and natural language processing domains and verify the effectiveness of the proposed frameworks. By accounting for the inherent trade-off, our frameworks provide meaningful insights into existing robust approaches beyond comparing robust accuracy alone.
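The abstract does not spell out how the class-specific scaling is applied, so the following is only a minimal sketch of one plausible reading: per-class scaling factors are multiplied into a trained model's prediction scores at test time, and a scale is swept on a validation set to trace the robust-versus-average accuracy trade-off. All function names, the binary-task setup, and the grid search below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def scaled_predictions(probs, class_scales):
    """Rescale per-class scores and re-predict labels.

    probs: (N, C) array of model scores (e.g., softmax outputs).
    class_scales: (C,) positive scaling factors, one per class.
    """
    return np.argmax(probs * class_scales[None, :], axis=1)

def group_accuracies(preds, labels, groups):
    """Accuracy within each (class, bias) group."""
    return np.array([(preds[groups == g] == labels[groups == g]).mean()
                     for g in np.unique(groups)])

def tradeoff_curve(probs, labels, groups, grid=np.linspace(0.5, 2.0, 16)):
    """Sweep a scaling factor for class 1 of a binary task (class 0 fixed at 1.0)
    and record the resulting (scale, worst-group accuracy, average accuracy) points."""
    curve = []
    for s in grid:
        preds = scaled_predictions(probs, np.array([1.0, s]))
        accs = group_accuracies(preds, labels, groups)
        curve.append((s, accs.min(), (preds == labels).mean()))
    return curve
```

Under this reading, the curve returned by `tradeoff_curve` on a validation set is what a scalar summary metric (such as the unified metric proposed in the paper) would condense into a single number for comparing debiasing methods.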
