
HyperTransformer: A Textural and Spectral Feature Fusion Transformer for Pansharpening (CVPR'22)

Wele Gedara Chaminda Bandara, and Vishal M. Patel

For more information, please see our paper.

Summary

Setting up a virtual conda environment

Set up a virtual conda environment using the provided environment.yaml file or requirements.txt.

conda env create --name HyperTransformer --file environment.yaml
conda activate HyperTransformer

or

conda create --name HyperTransformer --file requirements.txt
conda activate HyperTransformer

Download datasets

We use three publicly available HSI datasets for our experiments, namely:

  1. Pavia Center scene: Download the .mat file here, and save it in "./datasets/pavia_centre/Pavia_centre.mat".
  2. Botswana dataset: Download the .mat file here, and save it in "./datasets/botswana4/Botswana.mat".
  3. Chikusei dataset: Download the .mat file here, and save it in "./datasets/chikusei/chikusei.mat".

Processing the datasets to generate LR-HSI, PAN, and Reference-HR-HSI using Wald's protocol

We use Wald's protocol to generate the LR-HSI and PAN images. To generate the cubic patches:

  1. Run process_pavia.m in ./datasets/pavia_centre/ to generate cubic patches.
  2. Run process_botswana.m in ./datasets/botswana4/ to generate cubic patches.
  3. Run process_chikusei.m in ./datasets/chikusei/ to generate cubic patches.
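The degradation these scripts apply can be sketched in Python as follows. This is a minimal illustration of Wald's protocol, not a reproduction of the .m scripts: the box blur, the downsampling ratio of 4, and the band-average PAN are all illustrative assumptions.

```python
import numpy as np

def walds_protocol(hr_hsi: np.ndarray, ratio: int = 4):
    """Simulate (LR-HSI, PAN) from a reference HR-HSI cube of shape (H, W, C).

    LR-HSI: box blur + decimation by `ratio` (a crude MTF stand-in).
    PAN:    band average of the HR cube as a synthetic panchromatic image.
    """
    h, w, _ = hr_hsi.shape
    h, w = h - h % ratio, w - w % ratio          # crop to a multiple of `ratio`
    cube = hr_hsi[:h, :w]
    lr_hsi = cube.reshape(h // ratio, ratio, w // ratio, ratio, -1).mean(axis=(1, 3))
    pan = cube.mean(axis=2)
    return lr_hsi, pan

cube = np.random.rand(64, 64, 102)               # e.g. Pavia Centre has 102 bands
lr_hsi, pan = walds_protocol(cube)
print(lr_hsi.shape, pan.shape)                   # (16, 16, 102) (64, 64)
```

The reference HR-HSI itself is the original cube, so the network can be trained and evaluated against a known ground truth.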

Training HyperTransformer

We use a two-stage procedure to train our HyperTransformer.

We first train the backbone of HyperTransformer and then fine-tune the MHFA modules. This yields better results and faster convergence than training the whole network at once.

Training the Backbone of HyperTransformer

Use the following commands to pre-train HyperTransformer on the three datasets.

  1. Pre-training on Pavia Center Dataset:

    Change "train_dataset" to "pavia_dataset" in config_HSIT_PRE.json.

    Then use the following command to pre-train on the Pavia Center dataset:

    python train.py --config configs/config_HSIT_PRE.json

  2. Pre-training on Botswana Dataset:

    Change "train_dataset" to "botswana4_dataset" in config_HSIT_PRE.json.

    Then use the following command to pre-train on the Botswana dataset:

    python train.py --config configs/config_HSIT_PRE.json

  3. Pre-training on Chikusei Dataset:

    Change "train_dataset" to "chikusei_dataset" in config_HSIT_PRE.json.

    Then use the following command to pre-train on the Chikusei dataset:

    python train.py --config configs/config_HSIT_PRE.json
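Each run above differs only in the "train_dataset" key of the config file. If you prefer to switch datasets programmatically rather than editing the JSON by hand, a small helper like the following works; only the "train_dataset" key is taken from the instructions above, and any other layout of the config file is assumed.

```python
import json
from pathlib import Path

def set_train_dataset(config_path: str, dataset: str) -> None:
    """Rewrite the "train_dataset" key in a JSON config, leaving other keys intact."""
    path = Path(config_path)
    cfg = json.loads(path.read_text())
    cfg["train_dataset"] = dataset   # e.g. "pavia_dataset", "botswana4_dataset", "chikusei_dataset"
    path.write_text(json.dumps(cfg, indent=4))

# Example: set_train_dataset("configs/config_HSIT_PRE.json", "chikusei_dataset")
```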

Fine-tuning the MHFA modules in HyperTransformer

Next, we fine-tune the MHFA modules in HyperTransformer, starting from the pre-trained backbone obtained in the previous step.

  1. Fine-tuning MHFA on Pavia Center Dataset:

    Change "train_dataset" to "pavia_dataset" in config_HSIT.json.

    Then use the following command to fine-tune HyperTransformer on the Pavia Center dataset. Please specify the path to the best model obtained in the previous step using --resume:

    python train.py --config configs/config_HSIT.json --resume ./Experiments/HSIT_PRE/pavia_dataset/N_modules\(4\)/best_model.pth

  2. Fine-tuning on Botswana Dataset:

    Change "train_dataset" to "botswana4_dataset" in config_HSIT.json.

    Then use the following command to fine-tune on the Botswana dataset:

    python train.py --config configs/config_HSIT.json --resume ./Experiments/HSIT_PRE/botswana4/N_modules\(4\)/best_model.pth

  3. Fine-tuning on Chikusei Dataset:

    Change "train_dataset" to "chikusei_dataset" in config_HSIT.json.

    Then use the following command to fine-tune on the Chikusei dataset:

    python train.py --config configs/config_HSIT.json --resume ./Experiments/HSIT_PRE/chikusei_dataset/N_modules\(4\)/best_model.pth
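Conceptually, resuming from the pre-trained backbone means loading the stage-1 weights into the stage-2 model while leaving any parameters absent from the checkpoint (such as the MHFA modules) at their fresh initialization. A minimal PyTorch sketch of that idea follows; the checkpoint layout (a "state_dict" key) is an assumption, and the actual loading logic lives in train.py.

```python
import torch

def load_pretrained_backbone(model: torch.nn.Module, ckpt_path: str) -> None:
    """Load stage-1 weights with strict=False so parameters not present in the
    checkpoint (e.g. newly added fine-tuning modules) keep their random init."""
    ckpt = torch.load(ckpt_path, map_location="cpu")
    state = ckpt.get("state_dict", ckpt)          # tolerate bare state-dict checkpoints
    missing, unexpected = model.load_state_dict(state, strict=False)
    print(f"freshly initialized params: {len(missing)}, unused checkpoint keys: {len(unexpected)}")
```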

Trained models and pansharpened results on test-set

You can download the trained models and final prediction outputs through the following links for each dataset.

  1. Pavia Center: Download here
  2. Botswana: Download here
  3. Chikusei: Download here

Citation

If you find our work useful, please consider citing our paper.

@InProceedings{Bandara_2022_CVPR,
    author    = {Bandara, Wele Gedara Chaminda and Patel, Vishal M.},
    title     = {HyperTransformer: A Textural and Spectral Feature Fusion Transformer for Pansharpening},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {1767-1777}
}
