# PromptMR
This repository contains the PyTorch implementation of PromptMR, an unrolled model for multi-coil MRI reconstruction. See our paper *Fill the K-Space and Refine the Image: Prompting for Dynamic and Multi-Contrast MRI Reconstruction* for details.
## Updates
- [2024/01/04] 📃 We provide additional PromptMR evaluation results on the Calgary-Campinas and fastMRI multi-coil brain datasets. Pretrained models and code will be released in the future.
- [2023/10/15] 🔥 We have released training and inference code, along with pretrained PromptMR models, for both the CMRxRecon and fastMRI multi-coil knee datasets.
- [2023/10/12] 🥇 We secured 1st place in both the Cine and T1/T2 Mapping tasks of the CMRxRecon cardiac MR reconstruction challenge at MICCAI 2023! [Final ranking]
## Method
Overview of PromptMR: an all-in-one unrolled model for MRI reconstruction. Adjacent inputs, depicted in the image domain for visual clarity, provide neighboring k-space information for reconstruction. To accommodate different input varieties, an input-type adaptive visual prompt is integrated into each cascade of the unrolled architecture to guide the reconstruction process.
<div align="center"> <img src="assets/promptmr.png" width="900"/> </div>

Overview of the PromptUnet: the denoiser in each cascade of PromptMR. The PromptBlocks generate adaptively learned prompts across multiple levels, which are integrated with decoder features in the UpBlocks to allow rich hierarchical context learning.
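To make the unrolled structure concrete, here is a minimal PyTorch sketch of a cascade loop alternating a learned denoiser with a soft data-consistency step in k-space. This is an illustrative simplification, not the repository's implementation: `TinyDenoiser` is a hypothetical stand-in for the PromptUnet, and coil sensitivities, adjacent-input stacking, and the visual prompts are omitted for brevity.

```python
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Hypothetical stand-in for the PromptUnet denoiser (a small residual CNN)."""
    def __init__(self, chans=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(chans, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, chans, 3, padding=1),
        )

    def forward(self, x):
        return x + self.net(x)  # residual refinement of the image estimate

class UnrolledRecon(nn.Module):
    """Simplified unrolled reconstruction: each cascade denoises in the image
    domain, then enforces soft consistency with the measured k-space samples."""
    def __init__(self, num_cascades=4):
        super().__init__()
        self.cascades = nn.ModuleList(TinyDenoiser() for _ in range(num_cascades))
        # one learnable data-consistency weight per cascade
        self.dc_weight = nn.Parameter(torch.ones(num_cascades))

    def forward(self, masked_kspace, mask):
        # masked_kspace: (B, H, W) complex; mask: (B, H, W) bool
        kspace = masked_kspace.clone()
        for w, denoiser in zip(self.dc_weight, self.cascades):
            image = torch.fft.ifft2(kspace)
            x = torch.stack([image.real, image.imag], dim=1)  # (B, 2, H, W)
            x = denoiser(x)
            image = torch.complex(x[:, 0], x[:, 1])
            pred_k = torch.fft.fft2(image)
            # soft data consistency: pull sampled locations back toward measurements
            kspace = torch.where(mask, pred_k - w * (pred_k - masked_kspace), pred_k)
        return torch.fft.ifft2(kspace).abs()
```

In PromptMR the denoiser is the PromptUnet and each cascade additionally receives adjacent slices/frames and an input-type adaptive prompt; the alternation between image-domain refinement and k-space data consistency shown above is the core unrolled pattern.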
<div align="center"> <img src="assets/promptunet.png" width="500"/> </div>

## Installation and Data Preparation
See INSTALL.md for installation instructions and data preparation required to run this codebase.
## Training/Inference Code & Pretrained Models
### fastMRI multi-coil knee dataset
## Citation
If you find this repository useful, please consider giving it a star ⭐️ and citing our paper:
```bibtex
@article{xin2023fill,
  title={Fill the K-Space and Refine the Image: Prompting for Dynamic and Multi-Contrast MRI Reconstruction},
  author={Xin, Bingyu and Ye, Meng and Axel, Leon and Metaxas, Dimitris N},
  journal={arXiv preprint arXiv:2309.13839},
  year={2023}
}
```