

Poster in Workshop: ICML 2024 Workshop on Foundation Models in the Wild

PLUTO: Pathology-Universal Transformer

Dinkar Juyal · Harshith Padigela · Chintan Shah · Daniel Shenker · Natalia Harguindeguy · Yi Liu · Blake Martin · Yibo Zhang · Michael Nercessian · Miles Markey · Isaac Finberg · Kelsey Luu · Daniel Borders · Syed Ashar Javed · Emma Krause · Raymond Biju · Aashish Sood · Allen Ma · Jackson Nyman · John Shamshoian · Guillaume Chhor · Darpan Sanghavi · Marc Thibault · Limin Yu · Fedaa Najdawi · Jennifer Hipp · Darren Fahy · Benjamin Glass · Eric Walk · John Abel · Harsha Pokkalla · Andrew Beck · Sean Grullon

Keywords: [ Digital pathology ] [ foundation model ]


Abstract:

Pathology images pose a unique challenge for computer-vision-based analysis: a single whole slide image is gigapixel-sized and often contains hundreds of thousands to millions of objects of interest across multiple resolutions. In this work, we propose PathoLogy Universal TransfOrmer (PLUTO): a lightweight pathology foundation model (FM) that is pre-trained on a diverse dataset of 195 million image tiles collected from multiple sites. We design task-specific adaptation heads that use PLUTO's output embeddings for tasks ranging from the subcellular to the slide scale, including instance segmentation, tile classification, and slide-level prediction. We find that PLUTO matches or outperforms existing task-specific baselines and pathology-specific FMs, some of which use orders-of-magnitude larger datasets and model sizes. Our findings present a path towards a universal embedding to power pathology image analysis, and motivate further exploration of pathology FMs in terms of data diversity, architectural improvements, sample efficiency, and practical deployability in real-world applications.
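To make the adaptation-head idea concrete, the sketch below shows one common way a slide-level prediction head can sit on top of frozen tile embeddings: attention-based multiple-instance pooling followed by a linear classifier. This is a minimal illustration only; the abstract does not specify PLUTO's actual head architecture, embedding width, or pooling scheme, so all dimensions and the gated-attention choice here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- PLUTO's real embedding width and head design
# are not given in the abstract; these are illustrative only.
N_TILES, EMB_DIM, HID, N_CLASSES = 6, 8, 4, 2

# Frozen foundation-model output: one embedding per tile of the slide.
tile_embs = rng.normal(size=(N_TILES, EMB_DIM))

# Attention-MIL slide head: a small scoring network plus a linear classifier.
V = rng.normal(size=(EMB_DIM, HID))    # attention projection
w = rng.normal(size=(HID, 1))          # attention scoring vector
W_cls = rng.normal(size=(EMB_DIM, N_CLASSES))

def slide_logits(embs):
    # Per-tile attention scores, softmax-normalized over the slide's tiles.
    scores = np.tanh(embs @ V) @ w               # shape (N_TILES, 1)
    attn = np.exp(scores - scores.max())
    attn = attn / attn.sum()
    # Attention-weighted pooling collapses tiles into one slide embedding.
    slide_emb = (attn * embs).sum(axis=0)        # shape (EMB_DIM,)
    return slide_emb @ W_cls                     # class logits, shape (N_CLASSES,)

logits = slide_logits(tile_embs)
```

Tile-level heads (classification, segmentation) would instead consume the per-tile embeddings directly, without the pooling step.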
