

Morning Poster
in
Workshop: Artificial Intelligence & Human Computer Interaction

An Interactive Human-Machine Learning Interface for Collecting and Learning from Complex Annotations

Jonathan Erskine · Raul Santos-Rodriguez · Alexander Hepburn · Matt Clifford


Abstract:

Human-Computer Interaction has been shown to improve machine learning systems by boosting model performance, accelerating learning, and building user confidence. In this work, we propose a human-machine learning interface for binary classification tasks with the goal of allowing humans to provide richer forms of supervision and feedback, going beyond standard binary labels as annotations for a dataset. We aim to reverse the expectation that human annotators must adapt to the constraints imposed by labels, by allowing extra flexibility in the form in which supervision information is collected. To this end, we introduce the concept of task-oriented meta-evaluations and propose a prototype tool to efficiently capture human insight or knowledge about a task. Finally, we discuss the challenges facing future extensions of this work.
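To make the idea of "richer forms of supervision" concrete, one minimal sketch is an annotation record that carries more than the binary label, e.g. the annotator's self-reported confidence and a free-text rationale, which a learner could then use (for instance, as a sample weight). All names and fields below are illustrative assumptions, not the paper's actual interface or data format.

```python
from dataclasses import dataclass

# Hypothetical structure for a "rich" annotation: alongside the standard
# binary label, the annotator records a confidence score and a rationale.
# Field names are illustrative, not taken from the paper.
@dataclass
class RichAnnotation:
    item_id: str
    label: int                 # standard binary label (0 or 1)
    confidence: float = 1.0    # annotator's self-reported confidence in [0, 1]
    rationale: str = ""        # free-text explanation of the decision

def to_sample_weight(ann: RichAnnotation) -> float:
    """One possible use of the extra signal: map confidence into a
    training sample weight, so confident annotations count more than
    uncertain ones. Clamped to [0, 1] for safety."""
    return max(0.0, min(1.0, ann.confidence))

anns = [
    RichAnnotation("x1", 1, confidence=0.9, rationale="clear positive"),
    RichAnnotation("x2", 0, confidence=0.4, rationale="ambiguous case"),
]
weights = [to_sample_weight(a) for a in anns]  # e.g. pass to a weighted loss
```

The rationale text is kept unused here; in a fuller system it could feed qualitative analysis or weak-supervision heuristics.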
