Reference-Free Meta-Learning for Generalized Implicit Neural Representation in Efficient MRI Reconstruction
Abstract
Implicit Neural Representation (INR) has emerged as a powerful paradigm for continuous MRI reconstruction. However, standard unsupervised INR requires time-consuming optimization from scratch for each scan, hindering clinical deployment. This work presents IPOD, a Reference-Free Meta-Learning framework designed to learn generalized parameter initializations for INR directly from undersampled data. Distinct from conventional meta-learning, which relies on fully-sampled ground truth, IPOD operates in an inverse-problem-driven manner, leveraging diverse reconstruction tasks with varying sampling patterns to capture a robust prior. Furthermore, we introduce an adaptive meta-update strategy, modulated by task-specific performance, that keeps the learned initialization well-suited to diverse anatomical structures. Extensive experiments demonstrate that IPOD provides a superior initialization that enables rapid adaptation and achieves high-fidelity reconstruction across various imaging protocols, significantly outperforming existing INR baselines. By eliminating the dependence on reference images, IPOD offers a scalable and efficient solution for a wide range of imaging inverse problems. Code and data are available at: https://anonymous.4open.science/r/iPod-2C60
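To make the reference-free meta-learning idea concrete, the following toy sketch (hypothetical names; not the authors' implementation) shows the two ingredients the abstract describes: an inner loop that adapts to each task using only its own undersampled measurements (no ground truth), and a meta-update whose step is weighted by task-specific performance. A scalar "model" stands in for the INR parameters, and a first-order (Reptile-style) meta-update stands in for the full algorithm.

```python
def task_loss(theta, measurements):
    # Data-consistency loss computed on the task's own undersampled
    # measurements only -- no fully-sampled reference is ever used.
    return sum((theta - m) ** 2 for m in measurements) / len(measurements)

def inner_adapt(theta, measurements, lr=0.1, steps=5):
    # Inner loop: plain gradient descent from the shared initialization,
    # fitting this task's measurements (analogue of per-scan INR fitting).
    for _ in range(steps):
        grad = sum(2 * (theta - m) for m in measurements) / len(measurements)
        theta -= lr * grad
    return theta

def meta_train(tasks, theta=0.0, meta_lr=0.5, epochs=20):
    # Outer loop: first-order (Reptile-style) meta-update toward each
    # task's adapted parameters, with an adaptive per-task weight.
    for _ in range(epochs):
        updates, weights = [], []
        for measurements in tasks:
            adapted = inner_adapt(theta, measurements)
            updates.append(adapted - theta)
            # Adaptive weighting (one plausible choice): tasks with a
            # higher residual loss pull the initialization harder.
            weights.append(task_loss(adapted, measurements) + 1e-8)
        total = sum(weights)
        theta += meta_lr * sum(w * u for w, u in zip(weights, updates)) / total
    return theta

# Toy "tasks": each is a differently undersampled set of noisy
# observations of a scalar signal.
tasks = [[1.0, 1.2], [0.8, 1.1], [0.9, 1.3]]
theta0 = meta_train(tasks)
```

After meta-training, `theta0` sits near the shared structure of the task family, so a few inner steps suffice for a new task; this mirrors the rapid adaptation from a learned initialization that the abstract claims, though the real method operates on INR network weights and MRI sampling operators rather than scalars.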