

Poster in Workshop: Models of Human Feedback for AI Alignment

Query Design for Crowdsourced Clustering: Effect of Cognitive Overload and Contextual Bias

Yi Chen · Ramya Vinayak

Fri 26 Jul, 8 a.m. PDT

Abstract:

Crowdsourced clustering leverages human input to group items into clusters. The design of tasks for crowdworkers, specifically the number of items presented per query, affects both answer quality and cognitive load. This work investigates the trade-off between query size and answer accuracy, finding diminishing returns beyond 4–5 items per query. Crucially, we identify contextual bias in crowdworker responses: the likelihood of grouping two items depends not only on their similarity but also on the other items present in the query. This structured noise violates the context-independence assumption made by existing noise models. Our findings underscore the need for more nuanced noise models that account for the complex interplay between items and query context in crowdsourced clustering tasks.
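As a rough illustration of the query-size trade-off, here is a minimal, hypothetical sketch (a toy model, not the one analyzed in the paper): a k-item query implies C(k, 2) pairwise comparisons, but if per-comparison accuracy degrades with cognitive load as k grows, the useful signal per unit of worker effort peaks at a moderate query size. The base accuracy and load rate below are invented for illustration only.

```python
import math

def per_pair_accuracy(k, base=0.95, load=0.06):
    # Toy cognitive-load model (hypothetical): accuracy on each implied
    # pairwise judgment drops linearly as the query grows, floored at
    # chance level (0.5) for a binary same/different judgment.
    return max(base - load * (k - 2), 0.5)

def signal_per_item(k):
    # A k-item query implies C(k, 2) pairwise comparisons; count only
    # accuracy in excess of chance and normalize by worker effort (~k).
    acc = per_pair_accuracy(k)
    return math.comb(k, 2) * (2 * acc - 1) / k

for k in range(2, 9):
    print(f"k={k}: per-pair accuracy={per_pair_accuracy(k):.2f}, "
          f"signal per item={signal_per_item(k):.3f}")
```

With these made-up parameters the per-item signal peaks around k = 5 and declines afterwards, mirroring the diminishing returns the abstract reports beyond 4–5 items per query.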
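The contextual-bias finding can likewise be sketched with a toy model. Standard noise models assume the probability of grouping a pair is a fixed function of that pair alone; the hypothetical contextual model below shifts that probability by how much the pair stands out against the other items in the query (a contrast effect). The functional form and the bias strength are assumptions for illustration, not the paper's fitted model.

```python
def context_free_prob(sim_ij, flip_noise=0.1):
    # Standard assumption: the grouping probability depends only on the
    # pair's own similarity, mapped through a fixed noise rate.
    return flip_noise + (1 - 2 * flip_noise) * sim_ij

def contextual_prob(sim_ij, context_sims, flip_noise=0.1, bias=0.4):
    # Hypothetical contextual bias: the same pair looks more alike when
    # the rest of the query is dissimilar to it, and less alike when the
    # query is full of similar distractors.
    contrast = sim_ij - sum(context_sims) / len(context_sims)
    p = context_free_prob(sim_ij, flip_noise) + bias * contrast
    return min(max(p, 0.0), 1.0)

pair_sim = 0.5  # one fixed pair of moderately similar items
for label, ctx in [("similar distractors", [0.9, 0.9, 0.9]),
                   ("dissimilar distractors", [0.1, 0.1, 0.1])]:
    print(f"{label}: context-free={context_free_prob(pair_sim):.2f}, "
          f"contextual={contextual_prob(pair_sim, ctx):.2f}")
```

The context-free probability is identical in both rows, while the contextual one moves with the query composition; an inference procedure that assumes the former will misread the latter as genuine pairwise similarity.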
