# PromptMRG
Code for the AAAI 2024 paper "PromptMRG: Diagnosis-Driven Prompts for Medical Report Generation".
## Installation
- Clone this repository:
  ```bash
  git clone https://github.com/jhb86253817/PromptMRG.git
  ```
- Create and activate a new conda environment:
  ```bash
  conda create -n promptmrg python=3.10
  conda activate promptmrg
  ```
- Install the dependencies listed in `requirements.txt`:
  ```bash
  pip install -r requirements.txt
  ```
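After installing, a quick sanity check can confirm the environment is usable. The snippet below is only a minimal sketch; it assumes PyTorch is among the pinned dependencies in `requirements.txt`, so adjust it to whatever packages the file actually lists.

```bash
# Sanity check (assumption: PyTorch is in requirements.txt):
# confirm the interpreter version and that torch imports and sees a GPU if present.
python --version
python -c "import torch; print(torch.__version__, 'CUDA available:', torch.cuda.is_available())"
```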
## Datasets Preparation
- MIMIC-CXR: The images can be downloaded from either PhysioNet or R2Gen. The annotation file can be downloaded from Google Drive. Additionally, you need to download `clip_text_features.json` (the text features of the training database, extracted with a MIMIC-pretrained CLIP) from here. Put all of these under the folder `data/mimic_cxr/`.
- IU-Xray: The images can be downloaded from R2Gen and the annotation file can be downloaded from Google Drive. Put both the images and the annotation file under the folder `data/iu_xray/`.

Moreover, you need to download `chexbert.pth` from here for evaluating clinical efficacy and put it under `checkpoints/stanford/chexbert/`.
You will have the following structure:
```
PromptMRG
|--data
   |--mimic_cxr
      |--base_probs.json
      |--clip_text_features.json
      |--mimic_annotation_promptmrg.json
      |--images
         |--p10
         |--p11
         ...
   |--iu_xray
      |--iu_annotation_promptmrg.json
      |--images
         |--CXR1000_IM-0003
         |--CXR1001_IM-0004
         ...
|--checkpoints
   |--stanford
      |--chexbert
         |--chexbert.pth
...
```
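Before training or testing, you can verify that the downloaded files are in place. The following is a minimal sketch that only checks the paths listed in the tree above; adjust it if you keep the data elsewhere.

```bash
# Check that the annotation files, CLIP text features, and CheXbert weights
# sit where the scripts expect them (paths taken from the tree above).
for f in data/mimic_cxr/base_probs.json \
         data/mimic_cxr/clip_text_features.json \
         data/mimic_cxr/mimic_annotation_promptmrg.json \
         data/iu_xray/iu_annotation_promptmrg.json \
         checkpoints/stanford/chexbert/chexbert.pth; do
    [ -f "$f" ] && echo "OK       $f" || echo "MISSING  $f"
done
```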
## Training
- To train a model by yourself, run `bash train_mimic_cxr.sh` to train a model on MIMIC-CXR (a launch sketch follows this list).
- Alternatively, you can download a trained model weight from here. Note that this weight was trained with images from R2Gen; if you use images processed by yourself, you may obtain degraded performance with it, in which case you need to train a model by yourself.
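Training on MIMIC-CXR can take a while, so one convenient option is to launch the provided script in the background and keep a log. This is just a sketch of one way to do it; the script name comes from this repository, while the rest is generic shell usage.

```bash
# Run the MIMIC-CXR training script in the background and follow its log.
conda activate promptmrg
nohup bash train_mimic_cxr.sh > train_mimic_cxr.log 2>&1 &
tail -f train_mimic_cxr.log
```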
## Testing
Run `bash test_mimic_cxr.sh` to test a trained model on MIMIC-CXR and `bash test_iu_xray.sh` for IU-Xray.
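If you want to evaluate both datasets in one go and keep the metric output, a simple wrapper like the one below works; it only chains the two scripts shipped with the repository and saves their logs.

```bash
# Evaluate on MIMIC-CXR and IU-Xray sequentially, saving the output of each run.
bash test_mimic_cxr.sh 2>&1 | tee test_mimic_cxr.log
bash test_iu_xray.sh   2>&1 | tee test_iu_xray.log
```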