A toolbox for spectral compressive imaging reconstruction including MST (CVPR 2022), CST (ECCV 2022), DAUHST (NeurIPS 2022), BiSCI (NeurIPS 2023), HDNet (CVPR 2022), MST++ (CVPRW 2022), etc.

A Toolbox for Spectral Compressive Imaging


Authors

Yuanhao Cai*, Jing Lin*, Xiaowan Hu, Haoqian Wang, Xin Yuan, Yulun Zhang, Radu Timofte, and Luc Van Gool

Papers

  • MST: Mask-guided Spectral-wise Transformer for Efficient Hyperspectral Image Reconstruction (CVPR 2022)
  • CST: Coarse-to-Fine Sparse Transformer for Hyperspectral Image Reconstruction (ECCV 2022)
  • DAUHST: Degradation-Aware Unfolding Half-Shuffle Transformer for Spectral Compressive Imaging (NeurIPS 2022)
  • MST++: Multi-stage Spectral-wise Transformer for Efficient Spectral Reconstruction (CVPRW 2022)
  • HDNet: High-resolution Dual-domain Learning for Spectral Compressive Imaging (CVPR 2022)

Awards

  • Our further work MST++ won the NTIRE 2022 Spectral Reconstruction Challenge. 🏆

News

  • 2023.02.26 : We release the RGB images of five real scenes and ten simulation scenes. Please feel free to check and use them. 🌟
  • 2022.11.02 : We have provided more visual results of state-of-the-art methods and the function to evaluate the parameters and computational complexity of models. Please feel free to check and use them. 🔆
  • 2022.10.23 : Code, models, and reconstructed HSI results of DAUHST have been released. 🔥
  • 2022.09.15 : Our DAUHST has been accepted by NeurIPS 2022; code and models are coming soon. 🚀
  • 2022.07.20 : Code, models, and reconstructed HSI results of CST have been released. 🔥
  • 2022.07.04 : Our paper CST has been accepted by ECCV 2022; code and models are coming soon. 🚀
  • 2022.06.14 : Code and models of MST and MST++ have been released. This repo supports 11 learning-based methods to serve as a toolbox for spectral compressive imaging. The model zoo will be enlarged. 🔥
  • 2022.05.20 : Our work DAUHST is on arxiv. 💫
  • 2022.04.02 : Further work MST++ has won the NTIRE 2022 Spectral Reconstruction Challenge. 🏆
  • 2022.03.09 : Our work CST is on arxiv. 💫
  • 2022.03.02 : Our paper MST has been accepted by CVPR 2022; code and models are coming soon. 🚀
[Reconstruction results on Scene 2, Scene 3, Scene 4, and Scene 7]

1. Comparison with State-of-the-art Methods

This repo is a baseline and toolbox containing 11 learning-based algorithms for spectral compressive imaging.

The supported algorithms are listed in the quantitative comparison table below.

We are going to enlarge our model zoo in the future.

[Comparison figures: MST vs. SOTA, CST vs. MST, MST++ vs. SOTA, DAUHST vs. SOTA]

Quantitative Comparison on Simulation Dataset

| Method | Params (M) | FLOPS (G) | PSNR | SSIM | Model Zoo | Simulation Result | Real Result |
|--------|------------|-----------|------|------|-----------|-------------------|-------------|
| λ-Net | 62.64 | 117.98 | 28.53 | 0.841 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| TSA-Net | 44.25 | 110.06 | 31.46 | 0.894 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DGSMP | 3.76 | 646.65 | 32.63 | 0.917 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| GAP-Net | 4.27 | 78.58 | 33.26 | 0.917 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| ADMM-Net | 4.27 | 78.58 | 33.58 | 0.918 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| BIRNAT | 4.40 | 2122.66 | 37.58 | 0.960 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| HDNet | 2.37 | 154.76 | 34.97 | 0.943 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST-S | 0.93 | 12.96 | 34.26 | 0.935 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST-M | 1.50 | 18.07 | 34.94 | 0.943 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST-L | 2.03 | 28.15 | 35.18 | 0.948 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| MST++ | 1.33 | 19.42 | 35.99 | 0.951 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-S | 1.20 | 11.67 | 34.71 | 0.940 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-M | 1.36 | 16.91 | 35.31 | 0.947 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-L | 3.00 | 27.81 | 35.85 | 0.954 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| CST-L-Plus | 3.00 | 40.10 | 36.12 | 0.957 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-2stg | 1.40 | 18.44 | 36.34 | 0.952 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-3stg | 2.08 | 27.17 | 37.21 | 0.959 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-5stg | 3.44 | 44.61 | 37.75 | 0.962 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |
| DAUHST-9stg | 6.15 | 79.50 | 38.36 | 0.967 | Google Drive / Baidu Disk | Google Drive / Baidu Disk | Google Drive / Baidu Disk |

The performance is reported on 10 scenes of the KAIST dataset. FLOPS are measured at a test size of 256 x 256.

We also provide the RGB images of the five real scenes and ten simulation scenes for your convenience in drawing figures.

Note: access code for Baidu Disk is mst1

2. Create Environment:

  • Python 3 (we recommend Anaconda)

  • NVIDIA GPU + CUDA

  • Python packages:

pip install -r requirements.txt
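
For reference, a complete setup from scratch might look like the following minimal sketch; the environment name and Python version are illustrative assumptions, and you should install a CUDA-enabled PyTorch build that matches your driver:

# illustrative Anaconda setup (environment name and Python version are assumptions)
conda create -n mst python=3.8
conda activate mst
# install the Python packages required by this repo
pip install -r requirements.txt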

3. Prepare Dataset:

Download cave_1024_28 (Baidu Disk, code: fo0q | One Drive), CAVE_512_28 (Baidu Disk, code: ixoe | One Drive), KAIST_CVPR2021 (Baidu Disk, code: 5mmn | One Drive), TSA_simu_data (Baidu Disk, code: efu8 | One Drive), and TSA_real_data (Baidu Disk, code: eaqe | One Drive), then put them into the corresponding folders of datasets/ and organize them in the following form:

|--MST
    |--real
    	|-- test_code
    	|-- train_code
    |--simulation
    	|-- test_code
    	|-- train_code
    |--visualization
    |--datasets
        |--cave_1024_28
            |--scene1.mat
            |--scene2.mat
            :  
            |--scene205.mat
        |--CAVE_512_28
            |--scene1.mat
            |--scene2.mat
            :  
            |--scene30.mat
        |--KAIST_CVPR2021  
            |--1.mat
            |--2.mat
            : 
            |--30.mat
        |--TSA_simu_data  
            |--mask.mat   
            |--Truth
                |--scene01.mat
                |--scene02.mat
                : 
                |--scene10.mat
        |--TSA_real_data  
            |--mask.mat   
            |--Measurements
                |--scene1.mat
                |--scene2.mat
                : 
                |--scene5.mat

Following TSA-Net and DGSMP, we use the CAVE dataset (cave_1024_28) as the simulation training set. Both the CAVE (CAVE_512_28) and KAIST (KAIST_CVPR2021) datasets are used as the real training set.
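
To quickly check that the datasets are organized as above, you can run a short sanity-check script such as the sketch below (it is not part of this repo; it assumes scipy is installed, that it is run from the MST/ directory, and that the listed files exist; .mat files saved in MATLAB v7.3 format would need h5py instead of scipy.io):

import os
import scipy.io as sio

# illustrative check: open one scene from two of the datasets and list the stored arrays
dataset_root = 'datasets'  # assumes the script is run from the MST/ directory
samples = [
    ('cave_1024_28', 'scene1.mat'),
    (os.path.join('TSA_simu_data', 'Truth'), 'scene01.mat'),
]
for folder, scene in samples:
    path = os.path.join(dataset_root, folder, scene)
    data = sio.loadmat(path)
    for key, value in data.items():
        if not key.startswith('__'):  # skip MATLAB metadata entries
            print(path, key, getattr(value, 'shape', None))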

4. Simulation Experiment:

4.1 Training

cd MST/simulation/train_code/

# MST_S
python train.py --template mst_s --outf ./exp/mst_s/ --method mst_s 

# MST_M
python train.py --template mst_m --outf ./exp/mst_m/ --method mst_m  

# MST_L
python train.py --template mst_l --outf ./exp/mst_l/ --method mst_l 

# CST_S
python train.py --template cst_s --outf ./exp/cst_s/ --method cst_s 

# CST_M
python train.py --template cst_m --outf ./exp/cst_m/ --method cst_m  

# CST_L
python train.py --template cst_l --outf ./exp/cst_l/ --method cst_l

# CST_L_Plus
python train.py --template cst_l_plus --outf ./exp/cst_l_plus/ --method cst_l_plus

# GAP-Net
python train.py --template gap_net --outf ./exp/gap_net/ --method gap_net 

# ADMM-Net
python train.py --template admm_net --outf ./exp/admm_net/ --method admm_net 

# TSA-Net
python train.py --template tsa_net --outf ./exp/tsa_net/ --method tsa_net 

# HDNet
python train.py --template hdnet --outf ./exp/hdnet/ --method hdnet 

# DGSMP
python train.py --template dgsmp --outf ./exp/dgsmp/ --method dgsmp 

# BIRNAT
python train.py --template birnat --outf ./exp/birnat/ --method birnat 

# MST_Plus_Plus
python train.py --template mst_plus_plus --outf ./exp/mst_plus_plus/ --method mst_plus_plus 

# λ-Net
python train.py --template lambda_net --outf ./exp/lambda_net/ --method lambda_net

# DAUHST-2stg
python train.py --template dauhst_2stg --outf ./exp/dauhst_2stg/ --method dauhst_2stg

# DAUHST-3stg
python train.py --template dauhst_3stg --outf ./exp/dauhst_3stg/ --method dauhst_3stg

# DAUHST-5stg
python train.py --template dauhst_5stg --outf ./exp/dauhst_5stg/ --method dauhst_5stg

# DAUHST-9stg
python train.py --template dauhst_9stg --outf ./exp/dauhst_9stg/ --method dauhst_9stg

The training log, trained models, and reconstructed HSIs will be available in MST/simulation/train_code/exp/ .

4.2 Testing

Download the pretrained model zoo (Google Drive / Baidu Disk, code: mst1) and place the models in MST/simulation/test_code/model_zoo/

Run the following command to test the model on the simulation dataset.

cd MST/simulation/test_code/

# MST_S
python test.py --template mst_s --outf ./exp/mst_s/ --method mst_s --pretrained_model_path ./model_zoo/mst/mst_s.pth

# MST_M
python test.py --template mst_m --outf ./exp/mst_m/ --method mst_m --pretrained_model_path ./model_zoo/mst/mst_m.pth

# MST_L
python test.py --template mst_l --outf ./exp/mst_l/ --method mst_l --pretrained_model_path ./model_zoo/mst/mst_l.pth

# CST_S
python test.py --template cst_s --outf ./exp/cst_s/ --method cst_s --pretrained_model_path ./model_zoo/cst/cst_s.pth

# CST_M
python test.py --template cst_m --outf ./exp/cst_m/ --method cst_m --pretrained_model_path ./model_zoo/cst/cst_m.pth

# CST_L
python test.py --template cst_l --outf ./exp/cst_l/ --method cst_l --pretrained_model_path ./model_zoo/cst/cst_l.pth

# CST_L_Plus
python test.py --template cst_l_plus --outf ./exp/cst_l_plus/ --method cst_l_plus --pretrained_model_path ./model_zoo/cst/cst_l_plus.pth

# GAP_Net
python test.py --template gap_net --outf ./exp/gap_net/ --method gap_net --pretrained_model_path ./model_zoo/gap_net/gap_net.pth

# ADMM_Net
python test.py --template admm_net --outf ./exp/admm_net/ --method admm_net --pretrained_model_path ./model_zoo/admm_net/admm_net.pth

# TSA_Net
python test.py --template tsa_net --outf ./exp/tsa_net/ --method tsa_net --pretrained_model_path ./model_zoo/tsa_net/tsa_net.pth

# HDNet
python test.py --template hdnet --outf ./exp/hdnet/ --method hdnet --pretrained_model_path ./model_zoo/hdnet/hdnet.pth

# DGSMP
python test.py --template dgsmp --outf ./exp/dgsmp/ --method dgsmp --pretrained_model_path ./model_zoo/dgsmp/dgsmp.pth

# BIRNAT
python test.py --template birnat --outf ./exp/birnat/ --method birnat --pretrained_model_path ./model_zoo/birnat/birnat.pth

# MST_Plus_Plus
python test.py --template mst_plus_plus --outf ./exp/mst_plus_plus/ --method mst_plus_plus --pretrained_model_path ./model_zoo/mst_plus_plus/mst_plus_plus.pth

# λ-Net
python test.py --template lambda_net --outf ./exp/lambda_net/ --method lambda_net --pretrained_model_path ./model_zoo/lambda_net/lambda_net.pth

# DAUHST-2stg
python test.py --template dauhst_2stg --outf ./exp/dauhst_2stg/ --method dauhst_2stg --pretrained_model_path ./model_zoo/dauhst_2stg/dauhst_2stg.pth

# DAUHST-3stg
python test.py --template dauhst_3stg --outf ./exp/dauhst_3stg/ --method dauhst_3stg --pretrained_model_path ./model_zoo/dauhst_3stg/dauhst_3stg.pth

# DAUHST-5stg
python test.py --template dauhst_5stg --outf ./exp/dauhst_5stg/ --method dauhst_5stg --pretrained_model_path ./model_zoo/dauhst_5stg/dauhst_5stg.pth

# DAUHST-9stg
python test.py --template dauhst_9stg --outf ./exp/dauhst_9stg/ --method dauhst_9stg --pretrained_model_path ./model_zoo/dauhst_9stg/dauhst_9stg.pth
  • The reconstructed HSIs will be output to MST/simulation/test_code/exp/

  • Place the reconstructed results in MST/simulation/test_code/Quality_Metrics/results and run cal_quality_assessment.m to calculate the PSNR and SSIM of the reconstructed HSIs (a rough Python sketch of these metrics is given after this list).

  • Evaluating the Params and FLOPS of models

    We have provided a function my_summary() in simulation/test_code/utils.py. Please use this function to evaluate the parameters and computational complexity of the models, especially the Transformers, as follows:

from utils import my_summary
# evaluate the Params and FLOPS of MST for a 256 x 256 input with 28 spectral bands
my_summary(MST(), 256, 256, 28, 1)
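
If you prefer to sanity-check PSNR and SSIM in Python rather than MATLAB, a rough sketch is given below (it is not part of this repo; it assumes numpy and scikit-image are installed and that the ground-truth and reconstructed cubes are float arrays of shape H x W x bands scaled to [0, 1]; the reported numbers should still be computed with cal_quality_assessment.m):

import numpy as np
from skimage.metrics import structural_similarity

def psnr(gt, rec, data_range=1.0):
    # peak signal-to-noise ratio over the whole HSI cube
    mse = np.mean((gt - rec) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

def mean_ssim(gt, rec, data_range=1.0):
    # SSIM averaged over the spectral bands
    scores = [structural_similarity(gt[..., b], rec[..., b], data_range=data_range)
              for b in range(gt.shape[-1])]
    return float(np.mean(scores))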

4.3 Visualization

  • Put the reconstructed HSIs in MST/visualization/simulation_results/results and rename them as method.mat, e.g., mst_s.mat.

  • Generate the RGB images of the reconstructed HSIs

cd MST/visualization/
Run show_simulation.m
  • Draw the spectral density lines
cd MST/visualization/
Run show_line.m

5. Real Experiment:

5.1 Training

cd MST/real/train_code/

# MST_S
python train.py --template mst_s --outf ./exp/mst_s/ --method mst_s 

# MST_M
python train.py --template mst_m --outf ./exp/mst_m/ --method mst_m  

# MST_L
python train.py --template mst_l --outf ./exp/mst_l/ --method mst_l 

# CST_S
python train.py --template cst_s --outf ./exp/cst_s/ --method cst_s 

# CST_M
python train.py --template cst_m --outf ./exp/cst_m/ --method cst_m  

# CST_L
python train.py --template cst_l --outf ./exp/cst_l/ --method cst_l

# CST_L_Plus
python train.py --template cst_l_plus --outf ./exp/cst_l_plus/ --method cst_l_plus

# GAP-Net
python train.py --template gap_net --outf ./exp/gap_net/ --method gap_net 

# ADMM-Net
python train.py --template admm_net --outf ./exp/admm_net/ --method admm_net 

# TSA-Net
python train.py --template tsa_net --outf ./exp/tsa_net/ --method tsa_net 

# HDNet
python train.py --template hdnet --outf ./exp/hdnet/ --method hdnet 

# DGSMP
python train.py --template dgsmp --outf ./exp/dgsmp/ --method dgsmp 

# BIRNAT
python train.py --template birnat --outf ./exp/birnat/ --method birnat 

# MST_Plus_Plus
python train.py --template mst_plus_plus --outf ./exp/mst_plus_plus/ --method mst_plus_plus 

# λ-Net
python train.py --template lambda_net --outf ./exp/lambda_net/ --method lambda_net

# DAUHST-2stg
python train.py --template dauhst_2stg --outf ./exp/dauhst_2stg/ --method dauhst_2stg

# DAUHST-3stg
python train.py --template dauhst_3stg --outf ./exp/dauhst_3stg/ --method dauhst_3stg

# DAUHST-5stg
python train.py --template dauhst_5stg --outf ./exp/dauhst_5stg/ --method dauhst_5stg

# DAUHST-9stg
python train.py --template dauhst_9stg --outf ./exp/dauhst_9stg/ --method dauhst_9stg

The training log, trained models, and reconstructed HSIs will be available in MST/real/train_code/exp/

5.2 Testing

cd MST/real/test_code/

# MST_S
python test.py --template mst_s --outf ./exp/mst_s/ --method mst_s --pretrained_model_path ./model_zoo/mst/mst_s.pth

# MST_M
python test.py --template mst_m --outf ./exp/mst_m/ --method mst_m --pretrained_model_path ./model_zoo/mst/mst_m.pth

# MST_L
python test.py --template mst_l --outf ./exp/mst_l/ --method mst_l --pretrained_model_path ./model_zoo/mst/mst_l.pth

# CST_S
python test.py --template cst_s --outf ./exp/cst_s/ --method cst_s --pretrained_model_path ./model_zoo/cst/cst_s.pth

# CST_M
python test.py --template cst_m --outf ./exp/cst_m/ --method cst_m --pretrained_model_path ./model_zoo/cst/cst_m.pth

# CST_L
python test.py --template cst_l --outf ./exp/cst_l/ --method cst_l --pretrained_model_path ./model_zoo/cst/cst_l.pth

# CST_L_Plus
python test.py --template cst_l_plus --outf ./exp/cst_l_plus/ --method cst_l_plus --pretrained_model_path ./model_zoo/cst/cst_l_plus.pth

# GAP_Net
python test.py --template gap_net --outf ./exp/gap_net/ --method gap_net --pretrained_model_path ./model_zoo/gap_net/gap_net.pth

# ADMM_Net
python test.py --template admm_net --outf ./exp/admm_net/ --method admm_net --pretrained_model_path ./model_zoo/admm_net/admm_net.pth

# TSA_Net
python test.py --template tsa_net --outf ./exp/tsa_net/ --method tsa_net --pretrained_model_path ./model_zoo/tsa_net/tsa_net.pth

# HDNet
python test.py --template hdnet --outf ./exp/hdnet/ --method hdnet --pretrained_model_path ./model_zoo/hdnet/hdnet.pth

# DGSMP
python test.py --template dgsmp --outf ./exp/dgsmp/ --method dgsmp --pretrained_model_path ./model_zoo/dgsmp/dgsmp.pth

# BIRNAT
python test.py --template birnat --outf ./exp/birnat/ --method birnat --pretrained_model_path ./model_zoo/birnat/birnat.pth

# MST_Plus_Plus
python test.py --template mst_plus_plus --outf ./exp/mst_plus_plus/ --method mst_plus_plus --pretrained_model_path ./model_zoo/mst_plus_plus/mst_plus_plus.pth

# λ-Net
python test.py --template lambda_net --outf ./exp/lambda_net/ --method lambda_net --pretrained_model_path ./model_zoo/lambda_net/lambda_net.pth

# DAUHST_2stg
python test.py --template dauhst --outf ./exp/dauhst_2stg/ --method dauhst_2stg --pretrained_model_path ./model_zoo/dauhst/dauhst_2stg.pth

# DAUHST_3stg
python test.py --template dauhst --outf ./exp/dauhst_3stg/ --method dauhst_3stg --pretrained_model_path ./model_zoo/dauhst/dauhst_3stg.pth

# DAUHST_5stg
python test.py --template dauhst --outf ./exp/dauhst_5stg/ --method dauhst_5stg --pretrained_model_path ./model_zoo/dauhst/dauhst_5stg.pth

# DAUHST_9stg
python test.py --template dauhst --outf ./exp/dauhst_9stg/ --method dauhst_9stg --pretrained_model_path ./model_zoo/dauhst/dauhst_9stg.pth
  • The reconstructed HSIs will be output to MST/real/test_code/exp/

5.3 Visualization

  • Put the reconstructed HSIs in MST/visualization/real_results/results and rename them as method.mat, e.g., mst_plus_plus.mat.

  • Generate the RGB images of the reconstructed HSI

cd MST/visualization/
Run show_real.m

6. Citation

If this repo helps you, please consider citing our works:

# MST
@inproceedings{mst,
  title={Mask-guided Spectral-wise Transformer for Efficient Hyperspectral Image Reconstruction},
  author={Yuanhao Cai and Jing Lin and Xiaowan Hu and Haoqian Wang and Xin Yuan and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={CVPR},
  year={2022}
}


# CST
@inproceedings{cst,
  title={Coarse-to-Fine Sparse Transformer for Hyperspectral Image Reconstruction},
  author={Yuanhao Cai and Jing Lin and Xiaowan Hu and Haoqian Wang and Xin Yuan and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={ECCV},
  year={2022}
}


# DAUHST
@inproceedings{dauhst,
  title={Degradation-Aware Unfolding Half-Shuffle Transformer for Spectral Compressive Imaging},
  author={Yuanhao Cai and Jing Lin and Haoqian Wang and Xin Yuan and Henghui Ding and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={NeurIPS}, 
  year={2022}
}


# MST++
@inproceedings{mst_pp,
  title={MST++: Multi-stage Spectral-wise Transformer for Efficient Spectral Reconstruction},
  author={Yuanhao Cai and Jing Lin and Zudi Lin and Haoqian Wang and Yulun Zhang and Hanspeter Pfister and Radu Timofte and Luc Van Gool},
  booktitle={CVPRW},
  year={2022}
}


# HDNet
@inproceedings{hdnet,
  title={HDNet: High-resolution Dual-domain Learning for Spectral Compressive Imaging},
  author={Xiaowan Hu and Yuanhao Cai and Jing Lin and Haoqian Wang and Xin Yuan and Yulun Zhang and Radu Timofte and Luc Van Gool},
  booktitle={CVPR},
  year={2022}
}