Poster in Workshop: Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators

SelMix: Selective Mixup Fine Tuning for Optimizing Non-Decomposable Metrics

Shrinivas Ramasubramanian · Harsh Rangwani · Sho Takemori · Kunal Samanta · Yuhei Umeda · Venkatesh Babu Radhakrishnan


Abstract:

Natural data often exhibits class imbalance, which makes it difficult for machine learning models to classify minority classes accurately. Industrial machine-learning applications also often have objectives beyond accuracy: for example, models may be required to satisfy fairness criteria, such as not being biased against classes with fewer samples. Such objectives are often non-decomposable in nature. SelMix is a fine-tuning technique for improving the performance of machine learning models on imbalanced data. The core idea of our framework is to determine a sampling distribution for performing mixup of features between samples from particular classes such that the given objective is optimized. We evaluate our technique against existing empirical methods on standard benchmark datasets for imbalanced classification.
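To make the core idea concrete, here is a minimal sketch of one selective-mixup step, assuming a frozen feature extractor, a trainable classifier head, and a learned distribution over class pairs. All names (`features_by_class`, `pair_probs`, `classifier`) are hypothetical illustrations, and the paper's actual procedure for updating the sampling distribution from the non-decomposable objective is not shown:

```python
import torch
import torch.nn.functional as F

def selective_mixup_step(features_by_class, pair_probs, classifier, alpha=0.5):
    """One hypothetical selective-mixup step: sample a class pair (i, j)
    from the learned sampling distribution, mix one feature vector from
    each class, and compute a classification loss toward class i.

    features_by_class: dict mapping class index -> tensor of shape (n_c, d)
    pair_probs: tensor of shape (C, C), a distribution over class pairs
    classifier: a module mapping features of dim d to C logits
    """
    C = pair_probs.shape[0]
    # Sample a class pair (i, j) according to the current distribution.
    idx = torch.multinomial(pair_probs.flatten(), 1).item()
    i, j = idx // C, idx % C

    # Draw one feature vector from each of the two sampled classes.
    zi = features_by_class[i][torch.randint(len(features_by_class[i]), (1,))]
    zj = features_by_class[j][torch.randint(len(features_by_class[j]), (1,))]

    # Mix the features and train toward class i's label; this is a
    # simplified stand-in for the paper's feature-mixup scheme.
    z_mix = alpha * zi + (1 - alpha) * zj
    logits = classifier(z_mix)
    target = torch.tensor([i])
    return F.cross_entropy(logits, target)
```

In this sketch, classes that the objective favors mixing (e.g., minority classes whose metric terms would improve most) would receive higher mass in `pair_probs`, biasing fine-tuning toward those class pairs.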
