
A Toolbox for Spectral Compressive Imaging


Authors

Yuanhao Cai*, Jing Lin*, Xiaowan Hu, Haoqian Wang, Xin Yuan, Yulun Zhang, Radu Timofte, and Luc Van Gool

Papers

Awards

<img src="./figure/ntire.png" height=240> <img src="./figure/NTIRE_2024.png" height=240>

Introduction

This repo is a baseline and toolbox for spectral compressive imaging reconstruction, supporting over 15 algorithms. Our method MST++ won the NTIRE 2022 Challenge on Spectral Recovery from RGB Images. If you find this repo useful, please give it a star ⭐ and consider citing our paper in your research. Thank you.
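For context, most methods in this repo reconstruct a hyperspectral cube from a single CASSI measurement, which in the standard simulation setting is formed by masking the cube with a coded aperture, shifting each spectral channel along one axis, and summing over channels. A minimal numpy sketch of this forward model (the sizes and the 2-pixel dispersion step follow the common simulation setup; they are assumptions, not read from this repo's code):

```python
import numpy as np

def cassi_forward(x, mask, step=2):
    """Simulate a CASSI measurement.

    x:    (H, W, C) hyperspectral cube
    mask: (H, W) binary coded aperture
    step: per-channel dispersion shift in pixels (assumed 2 here)
    Returns a single 2D measurement of shape (H, W + step*(C-1)).
    """
    H, W, C = x.shape
    y = np.zeros((H, W + step * (C - 1)))
    for c in range(C):
        # mask the channel, then shift it along the width axis and accumulate
        y[:, c * step : c * step + W] += x[:, :, c] * mask
    return y

# toy example with the benchmark's usual 256x256x28 cube size
cube = np.random.rand(256, 256, 28)
mask = (np.random.rand(256, 256) > 0.5).astype(float)
meas = cassi_forward(cube, mask)
print(meas.shape)  # (256, 310)
```

Reconstruction methods in this toolbox invert this many-to-one mapping, recovering the 28-channel cube from the single shifted-and-summed measurement.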

News

Scene 2 | Scene 3 | Scene 4 | Scene 7
<img src="./figure/frame2channel12.gif" height=170 width=170><img src="./figure/frame3channel21.gif" width=170 height=170><img src="./figure/frame4channel28.gif" width=170 height=170><img src="./figure/frame7channel4.gif" width=170 height=170>

 

1. Comparison with State-of-the-art Methods

We support 12 learning-based algorithms and 3 model-based methods.

<details open>
<summary><b>Supported algorithms:</b></summary>

* TwIST
* GAP-TV
* DeSCI
* λ-Net
* TSA-Net
* DGSMP
* GAP-Net
* ADMM-Net
* BIRNAT
* HDNet
* MST
* MST++
* CST
* DAUHST
* BiSRNet

</details>

We will continue to enlarge the model zoo.

MST vs. SOTA | CST vs. MST
<img src="./figure/compare_fig.png" height=320><img src="./figure/cst_mst.png" height=320>
MST++ vs. SOTA | DAUHST vs. SOTA
<img src="./figure/mst_pp.png" height=320><img src="./figure/dauhst.png" height=320>
BiSRNet vs. SOTA BNNs
<img src="./figure/bisrnet.png" width=812>

Quantitative Comparison on Simulation Dataset

|   Method    | Params (M) | FLOPS (G) | PSNR  | SSIM  | Model Zoo | Simulation Result | Real Result |
|:-----------:|:----------:|:---------:|:-----:|:-----:|:---------:|:-----------------:|:-----------:|
| TwIST       |     -      |     -     | 23.12 | 0.669 |     -     | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| GAP-TV      |     -      |     -     | 24.36 | 0.669 |     -     | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DeSCI       |     -      |     -     | 25.27 | 0.721 |     -     | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| λ-Net       |   62.64    |  117.98   | 28.53 | 0.841 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| TSA-Net     |   44.25    |  110.06   | 31.46 | 0.894 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DGSMP       |    3.76    |  646.65   | 32.63 | 0.917 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| GAP-Net     |    4.27    |   78.58   | 33.26 | 0.917 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| ADMM-Net    |    4.27    |   78.58   | 33.58 | 0.918 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| BIRNAT      |    4.40    |  2122.66  | 37.58 | 0.960 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| HDNet       |    2.37    |  154.76   | 34.97 | 0.943 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST-S       |    0.93    |   12.96   | 34.26 | 0.935 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST-M       |    1.50    |   18.07   | 34.94 | 0.943 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST-L       |    2.03    |   28.15   | 35.18 | 0.948 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST++       |    1.33    |   19.42   | 35.99 | 0.951 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-S       |    1.20    |   11.67   | 34.71 | 0.940 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-M       |    1.36    |   16.91   | 35.31 | 0.947 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-L       |    3.00    |   27.81   | 35.85 | 0.954 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-L-Plus  |    3.00    |   40.10   | 36.12 | 0.957 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-2stg |    1.40    |   18.44   | 36.34 | 0.952 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-3stg |    2.08    |   27.17   | 37.21 | 0.959 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-5stg |    3.44    |   44.61   | 37.75 | 0.962 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-9stg |    6.15    |   79.50   | 38.36 | 0.967 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| BiSRNet     |   0.036    |   1.18    | 29.76 | 0.837 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |

Performance is reported on the 10 simulation scenes of the KAIST dataset. FLOPS are measured on a 256 × 256 input.

We also provide the RGB images of the five real scenes and ten simulation scenes for convenience when drawing comparison figures.

Note: the access code for all Baidu Disk links is mst1.
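The PSNR column above follows the standard definition; for reference, a minimal PSNR sketch (computed over the whole cube, with the data range assumed to be [0, 1] as is common for these benchmarks — this is an illustration, not this repo's evaluation code):

```python
import numpy as np

def psnr(ref, rec, data_range=1.0):
    """Peak signal-to-noise ratio between two arrays, in dB."""
    mse = np.mean((ref.astype(np.float64) - rec.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)

ref = np.zeros((8, 8))
rec = np.full((8, 8), 0.1)       # uniform error of 0.1 -> MSE = 0.01
print(round(psnr(ref, rec), 2))  # 20.0
```

In practice PSNR for hyperspectral cubes is often averaged per spectral channel; the two conventions can differ slightly when per-channel errors vary.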

 

2. Create Environment:

  pip install -r requirements.txt

 

3. Prepare Dataset:

Download cave_1024_28 (Baidu Disk, code: fo0q | One Drive), CAVE_512_28 (Baidu Disk, code: ixoe | One Drive), KAIST_CVPR2021 (Baidu Disk, code: 5mmn | One Drive), TSA_simu_data (Baidu Disk, code: efu8 | One Drive), and TSA_real_data (Baidu Disk, code: eaqe | One Drive), then put them into the corresponding folders of datasets/, organized as follows:

|--MST
    |--real
    	|-- test_code
    	|-- train_code
    |--simulation
    	|-- test_code
    	|-- train_code
    |--visualization
    |--datasets
        |--cave_1024_28
            |--scene1.mat
            |--scene2.mat
            :  
            |--scene205.mat
        |--CAVE_512_28
            |--scene1.mat
            |--scene2.mat
            :  
            |--scene30.mat
        |--KAIST_CVPR2021  
            |--1.mat
            |--2.mat
            : 
            |--30.mat
        |--TSA_simu_data  
            |--mask.mat   
            |--Truth
                |--scene01.mat
                |--scene02.mat
                : 
                |--scene10.mat
        |--TSA_real_data  
            |--mask.mat   
            |--Measurements
                |--scene1.mat
                |--scene2.mat
                : 
                |--scene5.mat
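To catch path mistakes early, here is a small stdlib-only sketch that checks the datasets/ layout above before training. The folder and probe-file names come from the tree; the script itself is not part of this repo:

```python
import os

# one representative file per dataset folder, taken from the tree above
EXPECTED = {
    "cave_1024_28": "scene1.mat",
    "CAVE_512_28": "scene1.mat",
    "KAIST_CVPR2021": "1.mat",
    "TSA_simu_data": "mask.mat",
    "TSA_real_data": "mask.mat",
}

def check_datasets(root):
    """Return the list of expected probe files missing under root/datasets/."""
    missing = []
    for folder, probe in EXPECTED.items():
        if not os.path.isfile(os.path.join(root, "datasets", folder, probe)):
            missing.append(os.path.join(folder, probe))
    return missing

if __name__ == "__main__":
    print(check_datasets("MST") or "datasets/ layout looks complete")
```

Run it from the directory containing MST/ before launching a long training job.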

Following TSA-Net and DGSMP, we use the CAVE dataset (cave_1024_28) as the simulation training set, and both the CAVE (CAVE_512_28) and KAIST (KAIST_CVPR2021) datasets as the real training set.
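During simulation training, models in this line of work are typically fed random 256 × 256 crops of the CAVE scenes with simple flip/rotation augmentation. A minimal sketch of that sampling step (the crop size and augmentations are the common setup, assumed rather than read from this repo's loaders):

```python
import numpy as np

def random_patch(hsi, size=256, rng=np.random):
    """Randomly crop a (size, size, C) patch and apply random flips/rotation."""
    H, W, _ = hsi.shape
    top = rng.randint(0, H - size + 1)
    left = rng.randint(0, W - size + 1)
    patch = hsi[top:top + size, left:left + size, :]
    if rng.rand() < 0.5:
        patch = patch[::-1, :, :]                 # vertical flip
    if rng.rand() < 0.5:
        patch = patch[:, ::-1, :]                 # horizontal flip
    patch = np.rot90(patch, k=rng.randint(4))     # random 90-degree rotation
    return np.ascontiguousarray(patch)

scene = np.random.rand(1024, 1024, 28)            # one cave_1024_28 scene
print(random_patch(scene).shape)                  # (256, 256, 28)
```

Each patch would then be pushed through the CASSI forward model to produce the training measurement.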

 

4. Simulation Experiment:

4.1 Training

cd MST/simulation/train_code/

# MST_S
python train.py --template mst_s --outf ./exp/mst_s/ --method mst_s 

# MST_M
python train.py --template mst_m --outf ./exp/mst_m/ --method mst_m  

# MST_L
python train.py --template mst_l --outf ./exp/mst_l/ --method mst_l 

# CST_S
python train.py --template cst_s --outf ./exp/cst_s/ --method cst_s 

# CST_M
python train.py --template cst_m --outf ./exp/cst_m/ --method cst_m  

# CST_L
python train.py --template cst_l --outf ./exp/cst_l/ --method cst_l

# CST_L_Plus
python train.py --template cst_l_plus --outf ./exp/cst_l_plus/ --method cst_l_plus

# GAP-Net
python train.py --template gap_net --outf ./exp/gap_net/ --method gap_net 

# ADMM-Net
python train.py --template admm_net --outf ./exp/admm_net/ --method admm_net 

# TSA-Net
python train.py --template tsa_net --outf ./exp/tsa_net/ --method tsa_net 

# HDNet
python train.py --template hdnet --outf ./exp/hdnet/ --method hdnet 

# DGSMP
python train.py --template dgsmp --outf ./exp/dgsmp/ --method dgsmp 

# BIRNAT
python train.py --template birnat --outf ./exp/birnat/ --method birnat 

# MST_Plus_Plus
python train.py --template mst_plus_plus --outf ./exp/mst_plus_plus/ --method mst_plus_plus 

# λ-Net
python train.py --template lambda_net --outf ./exp/lambda_net/ --method lambda_net

# DAUHST-2stg
python train.py --template dauhst_2stg --outf ./exp/dauhst_2stg/ --method dauhst_2stg

# DAUHST-3stg
python train.py --template dauhst_3stg --outf ./exp/dauhst_3stg/ --method dauhst_3stg

# DAUHST-5stg
python train.py --template dauhst_5stg --outf ./exp/dauhst_5stg/ --method dauhst_5stg

# DAUHST-9stg
python train.py --template dauhst_9stg --outf ./exp/dauhst_9stg/ --method dauhst_9stg

# BiSRNet
python train.py --template bisrnet --outf ./exp/bisrnet/ --method bisrnet
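All of the training commands above share one shape, so a subset can be launched from a small script. A sketch that prints each command as a dry run, or executes them when passed "run" (the method list here is an illustrative subset, not the full zoo):

```shell
# Dry-run by default: print the training command for each method.
# Pass "run" as the first argument to execute them sequentially instead.
methods="mst_s cst_s dauhst_3stg bisrnet"
for m in $methods; do
  cmd="python train.py --template $m --outf ./exp/$m/ --method $m"
  if [ "$1" = "run" ]; then
    $cmd
  else
    echo "$cmd"
  fi
done
```

Run it from MST/simulation/train_code/ so the relative --outf paths resolve correctly.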

4.2 Testing

Download the pretrained model zoo from (Google Drive / Baidu Disk, code: mst1) and place it in MST/simulation/test_code/model_zoo/.

Run the following command to test the model on the simulation dataset.

cd MST/simulation/test_code/

# MST_S
python test.py --template mst_s --outf ./exp/mst_s/ --method mst_s --pretrained_model_path ./model_zoo/mst/mst_s.pth

# MST_M
python test.py --template mst_m --outf ./exp/mst_m/ --method mst_m --pretrained_model_path ./model_zoo/mst/mst_m.pth

# MST_L
python test.py --template mst_l --outf ./exp/mst_l/ --method mst_l --pretrained_model_path ./model_zoo/mst/mst_l.pth

# CST_S
python test.py --template cst_s --outf ./exp/cst_s/ --method cst_s --pretrained_model_path ./model_zoo/cst/cst_s.pth

# CST_M
python test.py --template cst_m --outf ./exp/cst_m/ --method cst_m --pretrained_model_path ./model_zoo/cst/cst_m.pth

# CST_L
python test.py --template cst_l --outf ./exp/cst_l/ --method cst_l --pretrained_model_path ./model_zoo/cst/cst_l.pth

# CST_L_Plus
python test.py --template cst_l_plus --outf ./exp/cst_l_plus/ --method cst_l_plus --pretrained_model_path ./model_zoo/cst/cst_l_plus.pth

# GAP_Net
python test.py --template gap_net --outf ./exp/gap_net/ --method gap_net --pretrained_model_path ./model_zoo/gap_net/gap_net.pth

# ADMM_Net
python test.py --template admm_net --outf ./exp/admm_net/ --method admm_net --pretrained_model_path ./model_zoo/admm_net/admm_net.pth

# TSA_Net
python test.py --template tsa_net --outf ./exp/tsa_net/ --method tsa_net --pretrained_model_path ./model_zoo/tsa_net/tsa_net.pth

# HDNet
python test.py --template hdnet --outf ./exp/hdnet/ --method hdnet --pretrained_model_path ./model_zoo/hdnet/hdnet.pth

# DGSMP
python test.py --template dgsmp --outf ./exp/dgsmp/ --method dgsmp --pretrained_model_path ./model_zoo/dgsmp/dgsmp.pth

# BIRNAT
python test.py --template birnat --outf ./exp/birnat/ --method birnat --pretrained_model_path ./model_zoo/birnat/birnat.pth

# MST_Plus_Plus
python test.py --template mst_plus_plus --outf ./exp/mst_plus_plus/ --method mst_plus_plus --pretrained_model_path ./model_zoo/mst_plus_plus/mst_plus_plus.pth

# λ-Net
python test.py --template lambda_net --outf ./exp/lambda_net/ --method lambda_net --pretrained_model_path ./model_zoo/lambda_net/lambda_net.pth

# DAUHST-2stg
python test.py --template dauhst_2stg --outf ./exp/dauhst_2stg/ --method dauhst_2stg --pretrained_model_path ./model_zoo/dauhst_2stg/dauhst_2stg.pth

# DAUHST-3stg
python test.py --template dauhst_3stg --outf ./exp/dauhst_3stg/ --method dauhst_3stg --pretrained_model_path ./model_zoo/dauhst_3stg/dauhst_3stg.pth

# DAUHST-5stg
python test.py --template dauhst_5stg --outf ./exp/dauhst_5stg/ --method dauhst_5stg --pretrained_model_path ./model_zoo/dauhst_5stg/dauhst_5stg.pth

# DAUHST-9stg
python test.py --template dauhst_9stg --outf ./exp/dauhst_9stg/ --method dauhst_9stg --pretrained_model_path ./model_zoo/dauhst_9stg/dauhst_9stg.pth

# BiSRNet
python test.py --template bisrnet --outf ./exp/bisrnet/ --method bisrnet --pretrained_model_path ./model_zoo/bisrnet/bisrnet.pth

Run cal_quality_assessment.m in MATLAB to compute the PSNR and SSIM of the reconstructed results.

To count the parameters and FLOPS of a model, use the helpers in utils:

from utils import my_summary, my_summary_bnn
my_summary(MST(), 256, 256, 28, 1)          # full-precision models
my_summary_bnn(BiSRNet(), 256, 256, 28, 1)  # binarized models (BNNs)
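As a rough sanity check on the Params/FLOPS columns, the cost of a single convolution layer can be computed analytically. A minimal sketch (the layer here is an illustrative 28-to-28-channel 3×3 conv on a 256 × 256 input, not an actual layer from MST):

```python
def conv2d_cost(h, w, c_in, c_out, k, bias=True):
    """Parameter count and multiply-accumulate (MAC) count of one 2D conv
    layer with stride 1 and 'same' padding."""
    params = c_out * (c_in * k * k + (1 if bias else 0))
    macs = h * w * c_out * c_in * k * k
    return params, macs

params, macs = conv2d_cost(256, 256, 28, 28, 3)
print(params)      # 7084 parameters
print(macs / 1e9)  # ~0.46 GMACs
```

Note that FLOPS conventions vary: some tools report MACs, others count each MAC as two FLOPs, so always compare numbers produced with the same tool.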

4.3 Visualization

cd MST/visualization/
Run show_simulation.m
Run show_line.m

 

5. Real Experiment:

5.1 Training

cd MST/real/train_code/

# MST_S
python train.py --template mst_s --outf ./exp/mst_s/ --method mst_s 

# MST_M
python train.py --template mst_m --outf ./exp/mst_m/ --method mst_m  

# MST_L
python train.py --template mst_l --outf ./exp/mst_l/ --method mst_l 

# CST_S
python train.py --template cst_s --outf ./exp/cst_s/ --method cst_s 

# CST_M
python train.py --template cst_m --outf ./exp/cst_m/ --method cst_m  

# CST_L
python train.py --template cst_l --outf ./exp/cst_l/ --method cst_l

# CST_L_Plus
python train.py --template cst_l_plus --outf ./exp/cst_l_plus/ --method cst_l_plus

# GAP-Net
python train.py --template gap_net --outf ./exp/gap_net/ --method gap_net 

# ADMM-Net
python train.py --template admm_net --outf ./exp/admm_net/ --method admm_net 

# TSA-Net
python train.py --template tsa_net --outf ./exp/tsa_net/ --method tsa_net 

# HDNet
python train.py --template hdnet --outf ./exp/hdnet/ --method hdnet 

# DGSMP
python train.py --template dgsmp --outf ./exp/dgsmp/ --method dgsmp 

# BIRNAT
python train.py --template birnat --outf ./exp/birnat/ --method birnat 

# MST_Plus_Plus
python train.py --template mst_plus_plus --outf ./exp/mst_plus_plus/ --method mst_plus_plus 

# λ-Net
python train.py --template lambda_net --outf ./exp/lambda_net/ --method lambda_net

# DAUHST-2stg
python train.py --template dauhst_2stg --outf ./exp/dauhst_2stg/ --method dauhst_2stg

# DAUHST-3stg
python train.py --template dauhst_3stg --outf ./exp/dauhst_3stg/ --method dauhst_3stg

# DAUHST-5stg
python train.py --template dauhst_5stg --outf ./exp/dauhst_5stg/ --method dauhst_5stg

# DAUHST-9stg
python train.py --template dauhst_9stg --outf ./exp/dauhst_9stg/ --method dauhst_9stg

# BiSRNet
python train_s.py --outf ./exp/bisrnet/ --method bisrnet

5.2 Testing

Download the pretrained model zoo from (Google Drive / Baidu Disk, code: mst1) and place it in MST/real/test_code/model_zoo/.

cd MST/real/test_code/

# MST_S
python test.py --outf ./exp/mst_s/ --pretrained_model_path ./model_zoo/mst/mst_s.pth

# MST_M
python test.py --outf ./exp/mst_m/ --pretrained_model_path ./model_zoo/mst/mst_m.pth

# MST_L
python test.py --outf ./exp/mst_l/ --pretrained_model_path ./model_zoo/mst/mst_l.pth

# CST_S
python test.py --outf ./exp/cst_s/ --pretrained_model_path ./model_zoo/cst/cst_s.pth

# CST_M
python test.py --outf ./exp/cst_m/ --pretrained_model_path ./model_zoo/cst/cst_m.pth

# CST_L
python test.py --outf ./exp/cst_l/ --pretrained_model_path ./model_zoo/cst/cst_l.pth

# CST_L_Plus
python test.py --outf ./exp/cst_l_plus/ --pretrained_model_path ./model_zoo/cst/cst_l_plus.pth

# GAP_Net
python test.py --outf ./exp/gap_net/ --pretrained_model_path ./model_zoo/gap_net/gap_net.pth

# ADMM_Net
python test.py --outf ./exp/admm_net/ --pretrained_model_path ./model_zoo/admm_net/admm_net.pth

# TSA_Net
python test.py --outf ./exp/tsa_net/ --pretrained_model_path ./model_zoo/tsa_net/tsa_net.pth

# HDNet
python test.py --template hdnet --outf ./exp/hdnet/ --method hdnet --pretrained_model_path ./model_zoo/hdnet/hdnet.pth

# DGSMP
python test.py --outf ./exp/dgsmp/ --pretrained_model_path ./model_zoo/dgsmp/dgsmp.pth

# BIRNAT
python test.py --outf ./exp/birnat/ --pretrained_model_path ./model_zoo/birnat/birnat.pth

# MST_Plus_Plus
python test.py --outf ./exp/mst_plus_plus/ --pretrained_model_path ./model_zoo/mst_plus_plus/mst_plus_plus.pth

# λ-Net
python test.py --outf ./exp/lambda_net/ --pretrained_model_path ./model_zoo/lambda_net/lambda_net.pth

# DAUHST_2stg
python test.py --outf ./exp/dauhst_2stg/ --pretrained_model_path ./model_zoo/dauhst/dauhst_2stg.pth

# DAUHST_3stg
python test.py --outf ./exp/dauhst_3stg/ --pretrained_model_path ./model_zoo/dauhst/dauhst_3stg.pth

# DAUHST_5stg
python test.py --outf ./exp/dauhst_5stg/ --pretrained_model_path ./model_zoo/dauhst/dauhst_5stg.pth

# DAUHST_9stg
python test.py --outf ./exp/dauhst_9stg/ --pretrained_model_path ./model_zoo/dauhst/dauhst_9stg.pth

# BiSRNet
python test.py --outf ./exp/bisrnet  --pretrained_model_path ./model_zoo/bisrnet/bisrnet.pth --method bisrnet

5.3 Visualization

cd MST/visualization/
Run show_real.m

 

6. Citation

If this repo helps you, please consider citing our works:



# MST
@inproceedings{mst,
  title={Mask-guided Spectral-wise Transformer for Efficient Hyperspectral Image Reconstruction},
  author={Yuanhao Cai and Jing Lin and Xiaowan Hu and Haoqian Wang and Xin Yuan and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={CVPR},
  year={2022}
}


# CST
@inproceedings{cst,
  title={Coarse-to-Fine Sparse Transformer for Hyperspectral Image Reconstruction},
  author={Yuanhao Cai and Jing Lin and Xiaowan Hu and Haoqian Wang and Xin Yuan and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={ECCV},
  year={2022}
}


# DAUHST
@inproceedings{dauhst,
  title={Degradation-Aware Unfolding Half-Shuffle Transformer for Spectral Compressive Imaging},
  author={Yuanhao Cai and Jing Lin and Haoqian Wang and Xin Yuan and Henghui Ding and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={NeurIPS}, 
  year={2022}
}


# BiSCI
@inproceedings{bisci,
  title={Binarized Spectral Compressive Imaging},
  author={Yuanhao Cai and Yuxin Zheng and Jing Lin and Xin Yuan and Yulun Zhang and Haoqian Wang},
  booktitle={NeurIPS},
  year={2023}
}


# MST++
@inproceedings{mst_pp,
  title={MST++: Multi-stage Spectral-wise Transformer for Efficient Spectral Reconstruction},
  author={Yuanhao Cai and Jing Lin and Zudi Lin and Haoqian Wang and Yulun Zhang and Hanspeter Pfister and Radu Timofte and Luc Van Gool},
  booktitle={CVPRW},
  year={2022}
}


# HDNet
@inproceedings{hdnet,
  title={HDNet: High-resolution Dual-domain Learning for Spectral Compressive Imaging},
  author={Xiaowan Hu and Yuanhao Cai and Jing Lin and Haoqian Wang and Xin Yuan and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={CVPR},
  year={2022}
}