Oral in Workshop: Machine Learning for Multimodal Healthcare Data

Interpretable and Intervenable Ultrasonography-based Machine Learning Models for Pediatric Appendicitis

Julia Vogt · Ricards Marcinkevics · Patricia Reis Wolfertstetter · Ugne Klimiene · Kieran Chin-Cheong · Alyssia Paschke · Julia Zerres · Markus Denzinger · David Niederberger · Sven Wellmann · Ece Ozkan · Christian Knorr

Keywords: [ Co-creation and human-in-the-loop ] [ Medical Imaging ] [ Multimodal fusion ] [ Data sparsity, incompleteness and complexity ]


Abstract:

Appendicitis is among the most frequent reasons for pediatric abdominal surgeries. With recent advances in machine learning, data-driven decision support could help clinicians diagnose and manage patients while reducing the number of non-critical surgeries. However, previous decision support systems for appendicitis have focused on clinical, laboratory, scoring, and computed tomography data and have ignored abdominal ultrasound, despite its noninvasive nature and widespread availability. In this work, we present interpretable machine learning models for predicting the diagnosis, management, and severity of suspected appendicitis using ultrasound images. To this end, our approach utilizes concept bottleneck models (CBMs), which facilitate interpretation of, and interaction with, high-level concepts that are understandable to clinicians. Furthermore, we extend CBMs to prediction problems with multiple views and incomplete concept sets. Our models were trained on a dataset comprising 579 pediatric patients with 1709 ultrasound images accompanied by clinical and laboratory data. Results show that our proposed method enables clinicians to utilize a human-understandable and intervenable predictive model without compromising performance or requiring time-consuming image annotation when deployed.
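To make the bottleneck-and-intervention idea concrete, the following is a minimal sketch of a concept bottleneck model: the label prediction is routed exclusively through predicted concepts, and a clinician can override any concept at test time. All dimensions, weights, and the linear concept/label heads here are illustrative toys, not the authors' trained networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: image features -> clinical concepts -> label.
n_features, n_concepts = 8, 3

# Toy linear layers standing in for the learned concept and label heads.
W_concept = rng.normal(size=(n_features, n_concepts))
w_label = rng.normal(size=n_concepts)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, concept_override=None):
    """Predict concepts, then the label from concepts alone (the bottleneck).

    `concept_override` models a clinician intervention: any non-None entry
    replaces the corresponding predicted concept with the expert's value.
    """
    c = sigmoid(x @ W_concept)          # predicted concept probabilities
    if concept_override is not None:
        for i, v in enumerate(concept_override):
            if v is not None:
                c[i] = v                # human-in-the-loop intervention
    y = sigmoid(c @ w_label)            # label depends only on the concepts
    return c, y

x = rng.normal(size=n_features)
c_pred, y_pred = predict(x)
# Intervene on concept 0, e.g. a sonographic finding confirmed by the clinician.
_, y_intervened = predict(x, concept_override=[1.0, None, None])
```

Because the label head sees only the concept vector, correcting a concept directly and transparently changes the downstream prediction, which is what makes such models intervenable.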
