Poster in Workshop: Humans, Algorithmic Decision-Making and Society: Modeling Interactions and Impact
Query Design for Crowdsourced Clustering: Effect of Cognitive Overload and Contextual Bias
Yi Chen · Ramya Vinayak
Crowdsourced clustering leverages human input to group items into clusters. The design of tasks for crowdworkers, in particular the number of items presented per query, affects both answer quality and cognitive load. This work investigates the trade-off between query size and answer accuracy, revealing diminishing returns beyond 4-5 items per query. Crucially, we identify contextual bias in crowdworker responses: the likelihood that two items are grouped together depends not only on their similarity but also on the other items present in the query. This structured noise contradicts the assumption, common in existing noise models, that a response depends only on the pair of items being compared. Our findings underscore the need for more nuanced noise models that account for the interplay between items and query context in crowdsourced clustering tasks.
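To make the notion of contextual bias concrete, here is a minimal simulation sketch. This is not the authors' model: the probabilities, the `bias` parameter, and all function names are illustrative assumptions. It contrasts a standard context-free noise model, where the chance of grouping two items depends only on whether they share a cluster, with a toy contextual variant in which distractor items in the same query inflate that chance.

```python
import itertools
import random

def pairwise_group_prob(i, j, labels, p_same=0.9, p_diff=0.1):
    """Context-free noise model: the response depends only on the pair."""
    return p_same if labels[i] == labels[j] else p_diff

def contextual_group_prob(i, j, query, labels,
                          p_same=0.9, p_diff=0.1, bias=0.3):
    """Toy contextual model (an assumption, not the paper's model):
    items from unrelated clusters in the same query act as contrasting
    context and make the pair (i, j) look relatively more similar."""
    base = pairwise_group_prob(i, j, labels, p_same, p_diff)
    others = [k for k in query if k not in (i, j)]
    # Fraction of the other items in the query belonging to neither
    # i's nor j's cluster; these play the role of distractors.
    distractors = sum(labels[k] not in (labels[i], labels[j]) for k in others)
    frac = distractors / len(others) if others else 0.0
    return min(1.0, base + bias * frac)

def simulate_query(query, labels, model):
    """Return the set of pairs a simulated worker groups together.
    `model` takes (i, j, query, labels) and returns a probability."""
    grouped = set()
    for i, j in itertools.combinations(query, 2):
        if random.random() < model(i, j, query, labels):
            grouped.add((i, j))
    return grouped

if __name__ == "__main__":
    random.seed(0)
    labels = {0: "A", 1: "A", 2: "B", 3: "C", 4: "C"}  # ground-truth clusters
    query = [0, 2, 3, 4]  # 4 items per query, near the observed sweet spot
    context_free = lambda i, j, q, l: pairwise_group_prob(i, j, l)
    print("context-free:", simulate_query(query, labels, context_free))
    print("contextual:  ", simulate_query(query, labels, contextual_group_prob))
```

Under the contextual model, the same pair (i, j) can be grouped with different probabilities depending on which other items appear in the query, which is exactly the structured noise that a context-free (pairwise) model cannot express.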