CycleGAN in TensorFlow

[Update 9/26/2017] We observed faster convergence and better performance after adding a skip connection between the input and output in the generator. To turn this feature on, use the switch --skip=True. Below is the result of turning on skip after training for 23 epochs:

<img src='imgs/skip_result.jpg' width="900px"/>
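
A minimal usage sketch, assuming --skip=True is passed to the same entry point as the other training flags (paths here just reuse the Training example further below):

    python -m CycleGAN_TensorFlow.main \
        --to_train=1 \
        --log_dir=CycleGAN_TensorFlow/output/cyclegan/exp_01 \
        --config_filename=CycleGAN_TensorFlow/configs/exp_01.json \
        --skip=True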

This is the TensorFlow implementation for CycleGAN. The code was written by Harry Yang and Nathan Silberman.

CycleGAN: [Project] [Paper]

Introduction

This code contains two versions of the network architecture and hyper-parameters. The first is based on the TensorFlow implementation, and the second is based on the official PyTorch implementation. The differences are minor, and we observed that both versions produce good results. You may need to train several times, as the quality of the results is sensitive to the initialization.

Below is a snapshot of our result at the 50th epoch on one training instance:

<img src='imgs/horse2zebra.png' width="900px"/>

Getting Started

Prepare dataset
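
The dataset-preparation details are not spelled out here. As an assumption based on how training is driven by a dataset configuration file, the data loader reads a CSV pairing images from the two domains; the sketch below (build_cyclegan_csv and the paths are hypothetical, not part of the repository) shows one way such a file could be assembled:

    import csv
    import os

    def list_images(folder):
        """Return sorted paths of the JPEG/PNG images in a folder."""
        exts = ('.jpg', '.jpeg', '.png')
        return sorted(os.path.join(folder, f) for f in os.listdir(folder)
                      if f.lower().endswith(exts))

    def build_cyclegan_csv(dir_a, dir_b, out_csv):
        """Hypothetical helper: pair domain-A and domain-B images in a CSV."""
        images_a, images_b = list_images(dir_a), list_images(dir_b)
        with open(out_csv, 'w', newline='') as fh:
            writer = csv.writer(fh)
            for i in range(max(len(images_a), len(images_b))):
                # The domains are unaligned and may differ in size, so wrap
                # around the shorter list so every row has one image per domain.
                writer.writerow([images_a[i % len(images_a)],
                                 images_b[i % len(images_b)]])

    build_cyclegan_csv('horse2zebra/trainA', 'horse2zebra/trainB',
                       'CycleGAN_TensorFlow/input/horse2zebra_train.csv')

If the repository ships its own dataset-creation script, that should be preferred over this sketch.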

Training

    python -m CycleGAN_TensorFlow.main \
    --to_train=1 \
    --log_dir=CycleGAN_TensorFlow/output/cyclegan/exp_01 \
    --config_filename=CycleGAN_TensorFlow/configs/exp_01.json
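
The contents of exp_01.json are not reproduced in this document. As a rough sketch only, such a config typically holds the experiment hyper-parameters; every field name below is an assumption rather than something confirmed against the repository (the architecture version mentioned in the Introduction is presumably selected here as well):

    {
        "pool_size": 50,
        "base_lr": 0.0002,
        "max_step": 200,
        "network_version": "tensorflow",
        "dataset_name": "horse2zebra_train",
        "do_flipping": true
    }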

Restoring from a previous checkpoint.

    python -m CycleGAN_TensorFlow.main \
    --to_train=2 \
    --log_dir=CycleGAN_TensorFlow/output/cyclegan/exp_01 \
    --config_filename=CycleGAN_TensorFlow/configs/exp_01.json \
    --checkpoint_dir=CycleGAN_TensorFlow/output/cyclegan/exp_01/#timestamp#

Testing

    python -m CycleGAN_TensorFlow.main \
    --to_train=0 \
    --log_dir=CycleGAN_TensorFlow/output/cyclegan/exp_01 \
    --config_filename=CycleGAN_TensorFlow/configs/exp_01_test.json \
    --checkpoint_dir=CycleGAN_TensorFlow/output/cyclegan/exp_01/#old_timestamp# 

The test results are saved in CycleGAN_TensorFlow/output/cyclegan/exp_01/#new_timestamp#.