Are You Man Enough? Even Fair Algorithms Conform to Societal Norms
Myra Cheng · Maria De-Arteaga · Lester Mackey · Adam Tauman Kalai

We introduce Societal Norm Bias (SNoB), a subtle but consequential type of discrimination that may be exhibited by machine learning classification algorithms, even when these systems achieve group fairness objectives. This work illuminates the gap between definitions of algorithmic group fairness and concerns of harm based on adherence to societal norms. We study this issue through the lens of gender bias in occupation classification from online biographies. We quantify SNoB by measuring how an algorithm's predictions are associated with gender norms. This framework reveals that for classification tasks related to male-dominated occupations, fairness-aware classifiers favor biographies whose language aligns with masculine gender norms. We compare SNoB across fairness intervention techniques, finding that post-processing interventions do not mitigate this bias at all.

Author Information

Myra Cheng (California Institute of Technology)
Maria De-Arteaga (University of Texas at Austin)
Lester Mackey (Microsoft Research)
Adam Tauman Kalai (Microsoft Research)