• Stars: 121
• Rank: 293,924 (Top 6%)
• Language: Python
• License: BSD 2-Clause "Simplified"
• Created: over 4 years ago
• Updated: almost 3 years ago

Repository Details

Single Image Reflection Removal through Cascaded Refinement, CVPR 2020

Single Image Reflection Removal through Cascaded Refinement

We provide code and our dataset for the paper:

Single Image Reflection Removal through Cascaded Refinement
Chao Li, Yixiao Yang, Kun He, Stephen Lin, John E. Hopcroft
[CVPR 2020]

Introduction

For single image reflection removal (SIRR), researchers have observed that certain handcrafted priors can help distinguish the transmission layer from the reflection layer in a single image. However, these priors often do not generalize well to different types of reflections and scenes owing to disparate imaging conditions. In recent years, researchers have applied data-driven learning through deep convolutional neural networks to replace handcrafted priors. With abundant labeled data, a network can be trained to perform effectively over a broad range of scenes. Nevertheless, learning-based single image methods still have much room for improvement due to complications such as limited training data, disparate imaging conditions, varying scene content, limited physical understanding of the problem, and the performance limitations of existing models.

In this work, inspired by the iterative structure reduction approach for hidden community detection in social networks, we introduce a cascaded neural network model for transmission and reflection decomposition. To the best of our knowledge, no previous work on reflection removal has utilized a cascaded refinement approach.

For a cascade model on SIRR, a simple approach is to employ one network to generate a predicted transmission that serves as auxiliary information for the next network, and to continue this process with subsequent networks to iteratively improve prediction quality. With a long cascade, however, training becomes difficult due to the vanishing gradient problem and the limited training guidance available at each step. To address this issue, we design a convolutional LSTM (Long Short-Term Memory) network, which carries information from the previous iteration (i.e., time step) and allows gradients to flow unchanged.
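
To make the cascade concrete, the following is a minimal PyTorch sketch of one possible cascaded refinement loop driven by a convolutional LSTM cell. It illustrates the idea only and is not the IBCLN implementation in this repository; the module names, channel sizes, and the simplified ConvLSTM cell are assumptions made for the example.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    # A simplified convolutional LSTM cell (illustrative, not the repository's exact cell).
    def __init__(self, in_ch, hidden_ch):
        super().__init__()
        # One convolution produces the input, forget, output, and candidate gates.
        self.gates = nn.Conv2d(in_ch + hidden_ch, 4 * hidden_ch, kernel_size=3, padding=1)

    def forward(self, x, h, c):
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

class CascadedRefiner(nn.Module):
    # Iteratively refines transmission (T) and reflection (R) estimates for an input image I.
    def __init__(self, hidden_ch=32, steps=3):
        super().__init__()
        self.hidden_ch = hidden_ch
        self.steps = steps
        # Each step sees the input image plus the current T and R estimates (3 + 3 + 3 channels).
        self.encode = nn.Conv2d(9, hidden_ch, kernel_size=3, padding=1)
        self.cell = ConvLSTMCell(hidden_ch, hidden_ch)
        self.decode = nn.Conv2d(hidden_ch, 6, kernel_size=3, padding=1)  # predicts T and R

    def forward(self, I):
        b, _, height, width = I.shape
        T, R = I.clone(), torch.zeros_like(I)              # initial estimates
        h = I.new_zeros(b, self.hidden_ch, height, width)  # recurrent hidden state
        c = I.new_zeros(b, self.hidden_ch, height, width)  # recurrent cell state
        outputs = []
        for _ in range(self.steps):                        # each step refines the previous prediction
            feat = torch.relu(self.encode(torch.cat([I, T, R], dim=1)))
            h, c = self.cell(feat, h, c)                   # the LSTM state links the iterations
            T, R = torch.chunk(self.decode(h), 2, dim=1)
            outputs.append((T, R))
        return outputs                                     # supervise every step during training

Supervising the outputs of every step provides training guidance at each iteration, while the recurrent state helps mitigate the vanishing gradient issue mentioned above.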

Our main contributions are as follows:

  1. We propose a new cascaded network architecture with accompanying loss components that achieves state-of-the-art quantitative results on real-world benchmarks for the single image reflection removal problem.

  2. We design a residual reconstruction loss that forms a closed loop with the linear method for synthesizing images with reflections, extending the influence of the synthesis method across the whole network (a minimal sketch follows this list).

  3. We collect a new real-world dataset containing images with densely-labeled ground-truth, which can serve as baseline data in future research.
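
To illustrate the second contribution, the snippet below sketches how a residual reconstruction loss can close the loop with a linear synthesis model in which an image with reflection is approximately the sum of a transmission layer and a reflection layer. It is a hedged sketch under that assumption and is not the exact loss or synthesis method used in this repository; the weighting factor lambda_residual is a placeholder.

import torch.nn.functional as F

def residual_reconstruction_loss(I, T_pred, R_pred):
    # Re-synthesize the input from the predicted layers under the linear model I ~ T + R
    # and penalize the residual. The repository's actual synthesis (e.g. how the reflection
    # is blurred or attenuated) may differ; this is only an illustration.
    I_resynth = T_pred + R_pred
    return F.l1_loss(I_resynth, I)

# Hypothetical use inside a training step, alongside other loss terms:
# loss = pixel_loss + lambda_residual * residual_reconstruction_loss(I, T_pred, R_pred)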

Requisites

  • PyTorch: follow the instructions on the PyTorch homepage to install PyTorch for your operating system (a quick environment check is sketched after this list)
  • Python 3
  • Linux
  • CPU, or NVIDIA GPU with CUDA and cuDNN
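
As an optional sanity check of the environment (not part of the repository), you can verify the PyTorch installation and CUDA availability before running the scripts:

import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))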

Quick Start

Prepare Datasets

Download and unzip our dataset along with the images from Zhang et al., and then copy them to datasets/reflection.

Test

  • Download Pre-trained Model

Download and unzip our pre-trained model, and then copy its files to checkpoints/IBCLN.

  • Prepare data

Copy the image pairs T and I that you want to test to datasets/reflection/testA1 and datasets/reflection/testB, respectively. If the T image is not given, an all-black image is used as the ground truth.
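
For example, the pairs can be copied into place with a short Python snippet (a sketch only; the source folders my_T_images and my_I_images are placeholders for wherever your images are stored):

import shutil
from pathlib import Path

src_T, src_I = Path("my_T_images"), Path("my_I_images")  # placeholder source folders
dst_T = Path("datasets/reflection/testA1")               # ground-truth transmission images (T)
dst_I = Path("datasets/reflection/testB")                # input images with reflections (I)
dst_T.mkdir(parents=True, exist_ok=True)
dst_I.mkdir(parents=True, exist_ok=True)

for img in src_T.glob("*.png"):
    shutil.copy(img, dst_T / img.name)
for img in src_I.glob("*.png"):
    shutil.copy(img, dst_I / img.name)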

  • Run

You can run bash test.sh

or equivalently:

python test.py --dataroot datasets/reflection --name IBCLN --model IBCLN --dataset_mode resize_natural_3 --preprocess "" --no_flip --epoch final --gpu_ids 0

Train

  • Prepare data

Make two directories, trainA1 and trainA2, in datasets/reflection:

cd datasets/reflection
mkdir trainA1
mkdir trainA2

Prepare some images for synthesis: copy the T and R images to trainA1 and trainA2, respectively.

  • Run

You can run bash train.sh

or equivalently:

python train.py --dataroot datasets/reflection --name IBCLN --model IBCLN --dataset_mode resize_natural_3 --no_flip --gpu_id 0 --display_id -1

Description of the files in this repository

  1. train.py: Execute this file to train the model
  2. test.py: Execute this file to test the model
  3. model/IBCLN_model.py: Contains the class defining our model
  4. model/networks.py: Contains the functions that define the networks and losses
  5. options/base_options.py: This file contains the basic options
  6. options/train_options.py: This file contains the options for training
  7. options/test_options.py: This file contains the options for testing

Acknowledgement

Our code architecture is inspired by pytorch-CycleGAN-and-pix2pix.

Citation

If you find this code and data useful, please consider citing the original work by its authors:

@article{li2019single,
  title={Single Image Reflection Removal through Cascaded Refinement},
  author={Li, Chao and Yang, Yixiao and He, Kun and Lin, Stephen and Hopcroft, John E},
  journal={arXiv preprint arXiv:1911.06634},
  year={2019}
}

More Repositories

  1. NAGphormer - NAGphormer: A Tokenized Graph Transformer for Node Classification in Large Graphs (Python, 108 stars)
  2. VT - Enhancing the Transferability of Adversarial Attacks through Variance Tuning (Python, 80 stars)
  3. SI-NI-FGSM (Python, 67 stars)
  4. PWWS - Generating Natural Language Adversarial Examples through Probability Weighted Word Saliency (Python, 67 stars)
  5. TITer (Python, 42 stars)
  6. VSR-LKH - Combining Reinforcement Learning with Lin-Kernighan-Helsgaun Algorithm for the Traveling Salesman Problem (C, 41 stars)
  7. Admix (Python, 32 stars)
  8. ATDA - Improving the Generalization of Adversarial Training with Domain Adaptation (Python, 31 stars)
  9. VSR-LKH-V2 - Reinforced Lin-Kernighan-Helsgaun Algorithms for the Traveling Salesman Problem and its Variants (C, 24 stars)
  10. SVRE - Stochastic Variance Reduced Ensemble Adversarial Attack for Boosting the Adversarial Transferability (Python, 23 stars)
  11. FGPM - Adversarial Training with Fast Gradient Projection Method against Synonym Substitution based Text Attacks (Python, 23 stars)
  12. GHT - Graph Hawkes Transformer for Extrapolated Reasoning on Temporal Knowledge Graphs (Python, 18 stars)
  13. CircularKernel - Integrating Large Circular Kernels into CNNs through Neural Architecture Search (Python, 16 stars)
  14. EMI - Boosting Transferability through Enhanced Momentum (Python, 14 stars)
  15. RLFAT (Python, 13 stars)
  16. HiCode - Hidden Community Detection in Social Networks (C++, 12 stars)
  17. LMSPS - Long-range Meta-path Search through Progressive Sampling on Large-scale Heterogeneous Information Networks (Python, 10 stars)
  18. BandMaxSAT - BandMaxSAT: Multi-armed Bandit for the Local Search MaxSAT Solver (C++, 10 stars)
  19. AdvNMT-WSLS - Crafting Adversarial Examples for Neural Machine Translation (Python, 10 stars)
  20. MCDformer - Multi-channel Convolutional Distilled Transformer for Automatic Modulation Classification (Python, 8 stars)
  21. LOSP - Detecting Overlapping Communities from Local Spectral Subspaces (MATLAB, 7 stars)
  22. SEM (Python, 7 stars)
  23. PMMM - Differentiable Meta Multigraph Search with Partial Message Propagation on Heterogeneous Information Networks (Jupyter Notebook, 7 stars)
  24. FPS (C++, 5 stars)
  25. FTML - Robust Textual Embedding against Word-level Adversarial Attacks (Python, 5 stars)
  26. McSplit-RL - A Learning based Branch and Bound for Maximum Common Subgraph related Problems (C++, 5 stars)
  27. BandHS - BandMaxSAT: A Local Search MaxSAT Solver with Multi-armed Bandit (C, 5 stars)
  28. TextHacker - TextHacker: Learning based Hybrid Local Search Algorithm for Hard-label Text Adversarial Attack (Python, 4 stars)
  29. RGB - Relaxed Graph Color Bound for the Maximum k-plex Problem (C++, 4 stars)
  30. RSV - Detecting Textual Adversarial Examples through Randomized Substitution and Vote (Python, 3 stars)
  31. McSplit-LL - A Strengthened Branch and Bound Algorithm for Maximum Common Subgraph Problems (C++, 3 stars)
  32. HosIM - HoSIM: Higher-order Structural Importance based Method for Multiple Local Community Detection (Python, 3 stars)
  33. SparseMA - Sparse Black-Box Multimodal Attack for Vision-Language Adversary Generation (Python, 3 stars)
  34. VDLS - Effective Variable Depth Local Search for the Budgeted Maximum Coverage Problem (2 stars)
  35. AdaWaveClustering - Adaptive Wavelet Clustering for High Noise Data (Python, 2 stars)
  36. EIBC-IBP - Robustness-Aware Word Embedding Improves Certified Robustness to Adversarial Word Substitutions (Python, 2 stars)
  37. LOMA - Local Magnification for Data and Feature Augmentation (Python, 2 stars)
  38. CircleConvNet (2 stars)
  39. TraSA - TraSw: Tracklet-Switch Adversarial Attacks against Multi-Object Tracking (Python, 2 stars)
  40. IMGS - Image Mixing and Gradient Smoothing to Enhance the SAR Image Attack Transferability (Python, 2 stars)
  41. KD-Club - KD-Club: An Efficient Exact Algorithm with New Coloring-based Upper Bound for the Maximum k-Defective Clique Problem (C++, 2 stars)
  42. LLSA (MATLAB, 1 star)
  43. SPB-MaxSAT - Rethinking the Soft Conflict Pseudo Boolean Constraint on MaxSAT Local Search Solvers (C, 1 star)
  44. LOSP_Plus (MATLAB, 1 star)
  45. S-FGRM - Sampling-based Fast Gradient Rescaling Method for Highly Transferable Adversarial Attacks (Python, 1 star)
  46. LHLS - Learning-based Hybrid Local Search for the Hard-label Textual Attack (Python, 1 star)
  47. ASA-GS (C++, 1 star)