Official implementation of "Trash or Treasure? An Interactive Dual-Stream Strategy for Single Image Reflection Separation".

Overview

Trash or Treasure? An Interactive Dual-Stream Strategy for Single Image Reflection Separation (NeurIPS 2021)

by Qiming Hu and Xiaojie Guo.

Dependencies

  • Python3
  • PyTorch>=1.0
  • OpenCV-Python, TensorboardX, Visdom
  • NVIDIA GPU+CUDA
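
A typical environment can be set up with pip (assuming a matching PyTorch/CUDA build is installed first, e.g., following pytorch.org):

pip install opencv-python tensorboardX visdom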

Network Architecture

figure_arch

🚀 1. Single Image Reflection Separation

Data Preparation

Training dataset

  • 7,643 images from the Pascal VOC dataset, center-cropped to 224 × 224 patches for synthesizing training pairs (see the synthesis sketch after this list).
  • 90 real-world training pairs provided by Zhang et al.
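
For reference, a minimal pair-synthesis sketch along these lines is shown below (OpenCV/NumPy; the blur sigma and blending weight are assumptions for illustration, not the repo's exact recipe):

import cv2
import numpy as np

def synthesize_pair(t_path, r_path, size=224, sigma=2.0):
    # Illustrative only: center-crop two VOC images and blend a blurred "reflection"
    # onto a "transmission" layer. The repo's actual synthesis pipeline may differ.
    def center_crop(img):
        h, w = img.shape[:2]
        top, left = (h - size) // 2, (w - size) // 2
        return img[top:top + size, left:left + size]

    t = center_crop(cv2.imread(t_path)).astype(np.float32) / 255.0
    r = center_crop(cv2.imread(r_path)).astype(np.float32) / 255.0
    r_blur = cv2.GaussianBlur(r, (0, 0), sigma)   # reflections are usually defocused
    alpha = np.random.uniform(0.6, 0.8)           # hypothetical blending weight
    blended = np.clip(alpha * t + (1.0 - alpha) * r_blur, 0.0, 1.0)
    return blended, t, r_blur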

Testing dataset

  • 45 real-world testing images from the CEILNet dataset.
  • 20 real testing pairs provided by Zhang et al.
  • 454 real testing pairs from the SIR^2 dataset, comprising three subsets: Objects (200), Postcard (199), and Wild (55).

Usage

Training

  • For stage 1: python train_sirs.py --inet ytmt_ucs --model ytmt_model_sirs --name ytmt_ucs_sirs --hyper --if_align
  • For stage 2: python train_twostage_sirs.py --inet ytmt_ucs --model twostage_ytmt_model --name ytmt_uct_sirs --hyper --if_align --resume --resume_epoch xx --checkpoints_dir xxx

Testing

python test_sirs.py --inet ytmt_ucs --model twostage_ytmt_model --name ytmt_uct_sirs_test --hyper --if_align --resume --icnn_path ./checkpoints/ytmt_uct_sirs/twostage_unet_68_077_00595364.pt
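
If a downloaded checkpoint fails to load (see the state_dict issue in the comments below), it can help to inspect the file first. A minimal sketch, assuming the checkpoint is a standard torch.save dictionary whose network weights sit under an 'icnn' key (as the loading code in models/ suggests):

import torch

# Quick sanity check of a downloaded checkpoint; the 'icnn' entry is an assumption
# based on the state_dict key referenced in the repo's model-loading code.
state = torch.load('./checkpoints/ytmt_uct_sirs/twostage_unet_68_077_00595364.pt', map_location='cpu')
if isinstance(state, dict):
    print(list(state.keys()))
    if 'icnn' in state:
        print(len(state['icnn']), 'tensors in the stored network weights')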

Trained weights

Google Drive

Visual comparison on real20 and SIR^2

figure_eval

Visual comparison on real45

figure_test

🚀 2. Single Image Denoising

Data Preparation

Training datasets

400 images from the Berkeley segmentation dataset, following DnCNN.
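
Since --mode B (below) trains a blind-noise model in the DnCNN style, clean patches are corrupted during training with Gaussian noise of a randomly drawn level. A minimal sketch, with the [0, 55] sigma range taken from the original DnCNN-B protocol rather than from this repo's code:

import torch

def add_gaussian_noise(clean, blind=True, sigma=25.0):
    # clean: float tensor in [0, 1]. Returns (noisy, sigma_used).
    if blind:
        sigma = float(torch.empty(1).uniform_(0.0, 55.0))  # assumed blind range (DnCNN-B)
    noisy = clean + torch.randn_like(clean) * (sigma / 255.0)
    return noisy, sigma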

Testing datasets

The BSD68 dataset and Set12.
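
Denoising results on these sets are usually reported as PSNR; a small, repo-independent helper for checking restored outputs:

import numpy as np

def psnr(clean, restored, data_range=1.0):
    # Peak signal-to-noise ratio (dB) between two images scaled to [0, data_range].
    clean = np.asarray(clean, dtype=np.float64)
    restored = np.asarray(restored, dtype=np.float64)
    mse = np.mean((clean - restored) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10((data_range ** 2) / mse)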

Usage

Training

python train_denoising.py --inet ytmt_pas --name ytmt_pas_denoising --preprocess True --num_of_layers 9 --mode B

Testing

python test_denoising.py --inet ytmt_pas --name ytmt_pas_denoising_blindtest_25 --test_noiseL 25 --num_of_layers 9 --test_data Set68 --icnn_path ./checkpoints/ytmt_pas_denoising_49_157500.pt

Trained weights

Google Drive

Visual comparison on a sample from BSD68

figure_eval_denoising

🚀 3. Single Image Demoireing

Data Preparation

Training dataset

AIM 2019 Demoireing Challenge

Testing dataset

100 moiré and clean image pairs from the AIM 2019 Demoireing Challenge.
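
If you need to feed the pairs into PyTorch outside the provided scripts, a minimal paired loader could look as follows; the moire/clean directory names and matching file names are assumptions, not the layout the repo's own data loader expects:

import os
import cv2
import torch
from torch.utils.data import Dataset

class PairedMoireDataset(Dataset):
    # Illustrative paired (moire, clean) loader with a hypothetical directory layout.
    def __init__(self, root):
        self.moire_dir = os.path.join(root, 'moire')
        self.clean_dir = os.path.join(root, 'clean')
        self.names = sorted(os.listdir(self.moire_dir))

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        moire = cv2.imread(os.path.join(self.moire_dir, name)).astype('float32') / 255.0
        clean = cv2.imread(os.path.join(self.clean_dir, name)).astype('float32') / 255.0
        # HWC (BGR) arrays -> CHW float tensors
        to_tensor = lambda x: torch.from_numpy(x.transpose(2, 0, 1).copy())
        return to_tensor(moire), to_tensor(clean)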

Usage

Training

python train_demoire.py --inet ytmt_ucs --model ytmt_model_demoire --name ytmt_uas_demoire --hyper --if_align

Testing

python test_demoire.py --inet ytmt_ucs --model ytmt_model_demoire --name ytmt_uas_demoire_test --hyper --if_align --resume --icnn_path ./checkpoints/ytmt_ucs_demoire/ytmt_ucs_opt_086_00860000.pt

Trained weights

Google Drive

Visual comparison on the validation set of LCDMoire

figure_eval_demoire


Comments
  • Datasets

    Hi,

    I have been trying to experiment with the model, but I'm having trouble finding the correct datasets for testing. The SIR^2 dataset in the provided link doesn't have the images set up with the naming conventions used in the script. Could you please direct me to the correct datasets for testing and training? Is there a separate repository that you have used?

    Thanks so much,

    David

    opened by davidgaddie 3
  • About Training Details

    Hello, thank you for sharing your wonderful work. I have some questions about the training details. The paper says the number of training epochs is 120, but the epoch count is set to 60 in YTMT-Strategy/options/net_options/train_options.py. Moreover, the best model in your paper is YTMT-UCT, which needs two-stage training. Can you provide the training settings for YTMT-UCT (epochs, batch size, ...)? Looking forward to your reply!

    opened by DUT-CSJ 2
  • CUDA vram allocation issue

    Hi,

    I've been trying to run the reflection test code, but I get this error: RuntimeError: CUDA out of memory. Tried to allocate 15.66 GiB (GPU 0; 22.20 GiB total capacity; 16.09 GiB already allocated; 2.68 GiB free; 17.55 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

    I'm running on an A10G GPU on AWS. I suspect that maybe the dataset is incorrect as each image in the dataset I have is around 800MB. If that's the case can I please be directed to the correct repository for the read20_420 images?

    Thanks so much,

    David

    opened by davidgaddie 1
  • test demoire error

    Thanks for your great work, but I get an error when I run: python test_demoire.py --inet ytmt_ucs --model ytmt_model_demoire --name ytmt_uas_demoire_test --hyper --if_align --resume --icnn_path checkpoints/ytmt_ucs_demoire/ytmt_ucs_demoire_opt_086_00860000.pt

    -------------- End ----------------
    [i] initialization method [edsr]
    Traceback (most recent call last):
      File "test_demoire.py", line 28, in <module>
        engine = Engine(opt)
      File "/nfs_data/code/YTMT-Strategy-main/engine.py", line 19, in __init__
        self.__setup()
      File "/nfs_data/code/YTMT-Strategy-main/engine.py", line 29, in __setup
        self.model.initialize(opt)
      File "/nfs_data/code/YTMT-Strategy-main/models/ytmt_model_demoire.py", line 242, in initialize
        self.load(self, opt.resume_epoch)
      File "/nfs_data/code/YTMT-Strategy-main/models/ytmt_model_demoire.py", line 413, in load
        model.net_i.load_state_dict(state_dict['icnn'])
      File "/opt/conda/envs/torch/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1223, in load_state_dict
        raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
    RuntimeError: Error(s) in loading state_dict for YTMT_US:
        Missing key(s) in state_dict: "inc.ytmt_head.fusion_l.weight", "inc.ytmt_head.fusion_l.bias", "inc.ytmt_head.fusion_r.weight", "inc.ytmt_head.fusion_r.bias", "down1.model.ytmt_head.fusion_l.weight", "down1.model.ytmt_head.fusion_l.bias", "down1.model.ytmt_head.fusion_r.weight", "down1.model.ytmt_head.fusion_r.bias", "down2.model.ytmt_head.fusion_l.weight", "down2.model.ytmt_head.fusion_l.bias", "down2.model.ytmt_head.fusion_r.weight", "down2.model.ytmt_head.fusion_r.bias", "down3.model.ytmt_head.fusion_l.weight", "down3.model.ytmt_head.fusion_l.bias", "down3.model.ytmt_head.fusion_r.weight", "down3.model.ytmt_head.fusion_r.bias", "down4.model.ytmt_head.fusion_l.weight", "down4.model.ytmt_head.fusion_l.bias", "down4.model.ytmt_head.fusion_r.weight", "down4.model.ytmt_head.fusion_r.bias", "up1.model.ytmt_head.fusion_l.weight", "up1.model.ytmt_head.fusion_l.bias", "up1.model.ytmt_head.fusion_r.weight", "up1.model.ytmt_head.fusion_r.bias", "up2.model.ytmt_head.fusion_l.weight", "up2.model.ytmt_head.fusion_l.bias", "up2.model.ytmt_head.fusion_r.weight", "up2.model.ytmt_head.fusion_r.bias", "up3.model.ytmt_head.fusion_l.weight", "up3.model.ytmt_head.fusion_l.bias", "up3.model.ytmt_head.fusion_r.weight", "up3.model.ytmt_head.fusion_r.bias", "up4.model.ytmt_head.fusion_l.weight", "up4.model.ytmt_head.fusion_l.bias", "up4.model.ytmt_head.fusion_r.weight", "up4.model.ytmt_head.fusion_r.bias".

    opened by zdyshine 1