
CDTNet: High-Resolution Image Harmonization via Collaborative Dual Transformations (CVPR 2022)

Unofficial implementation of "High-Resolution Image Harmonization via Collaborative Dual Transformations (CVPR 2022)" in PyTorch.

Example Results

Prerequisites

  • Linux
  • Python 3
  • CPU or NVIDIA GPU + CUDA CuDNN

Datasets

Train

  1. Download the HRNet-W18-C model (hrnetv2_w18_imagenet_pretrained.pth) from HRNets.
  2. Put it in the pretrained folder.
  3. Run:
CUDA_VISIBLE_DEVICES="0,1,2,3,4,5,6,7" python train.py --model iih_base --name iih_base_allidh_test --dataset_root ~/IHD/ --dataset_name HAdobe5k --batch_size 80 --init_port 50000
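Steps 1–2 amount to placing the downloaded checkpoint in a pretrained/ directory. A minimal sketch, assuming the checkpoint was downloaded to the current directory:

```shell
# Create the folder the training script loads the HRNet weights from
mkdir -p pretrained
# Move the downloaded checkpoint into place (download location assumed)
if [ -f hrnetv2_w18_imagenet_pretrained.pth ]; then
  mv hrnetv2_w18_imagenet_pretrained.pth pretrained/
fi
```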

Test

Train a new model

If you train a new model with the command above, latest_net_G.pth and latest_net_P2P.pth will be generated in the directory checkpoints/iih_base_lt_allihd.

Run:

CUDA_VISIBLE_DEVICES="0,1,2,3,4,5,6,7" python test.py --model iih_base --name iih_base_allidh_test --dataset_root ~/IHD/ --dataset_name HAdobe5k --batch_size 80 --init_port 50000

Apply pre-trained model

  1. Download the pre-trained models (model_G and model_P2P).
  2. Rename them to latest_net_G.pth and latest_net_P2P.pth, respectively.
  3. Put them in the directory checkpoints/iih_base_lt_allihd.
  4. Run the test command above.
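Steps 2–3 can be sketched as shell commands (the downloaded filenames model_G.pth and model_P2P.pth are assumptions; adjust them to whatever the download actually produces):

```shell
# Create the checkpoint directory the test script reads from
mkdir -p checkpoints/iih_base_lt_allihd
# Rename and move the downloaded weights (source filenames assumed)
if [ -f model_G.pth ]; then
  mv model_G.pth checkpoints/iih_base_lt_allihd/latest_net_G.pth
fi
if [ -f model_P2P.pth ]; then
  mv model_P2P.pth checkpoints/iih_base_lt_allihd/latest_net_P2P.pth
fi
```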

Quantitative Results

| Image Size | CDTNet (officially reported) | CDTNet (implemented) |
|---|---|---|
| 256x256 | 38.24 | 37.42 |
| 1024x1024 (after LUT) | 37.65 | 37.13 |
| 1024x1024 | 38.77 | 37.30 |
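The scores above compare harmonized outputs against ground truth; assuming they are PSNR values in dB (the original section does not name the metric), a minimal sketch of how such a score is computed:

```python
import numpy as np

def psnr(pred: np.ndarray, target: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((pred.astype(np.float64) - target.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: two 8-bit images differing by a constant offset of 10
a = np.zeros((4, 4), dtype=np.uint8)
b = np.full((4, 4), 10, dtype=np.uint8)
print(round(psnr(a, b), 2))  # → 28.13
```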

Acknowledgement

We borrowed some of the data modules and model functions from the repositories of IntrinsicHarmony, iSSAM, and 3DLUT.
