
BEGAN : Boundary Equilibrium Generative Adversarial Networks


Overview

PyTorch implementation of BEGAN (arXiv:1703.10717).

(figure: overview)

Objectives

(figure: objective)
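
The figure summarizes the objectives from the paper: the discriminator is an autoencoder, L(v) = |v - D(v)| is its reconstruction loss, and a balance term k_t trades off the real and generated reconstruction losses. Below is a minimal PyTorch sketch of those losses; the names D, G, real, z and the default gamma / lambda_k values are illustrative, not taken from this repository's code.

import torch

def recon_loss(D, v):
    # L(v) = |v - D(v)|: mean L1 reconstruction error of the autoencoder discriminator
    return torch.mean(torch.abs(v - D(v)))

def began_step_losses(D, G, real, z, k, gamma=0.5, lambda_k=0.001):
    """Losses for one training step; k is the balance variable k_t, kept in [0, 1]."""
    fake = G(z)
    loss_real = recon_loss(D, real)
    loss_fake = recon_loss(D, fake.detach())

    loss_D = loss_real - k * loss_fake   # L_D = L(x) - k_t * L(G(z))
    loss_G = recon_loss(D, fake)         # L_G = L(G(z))

    # k_{t+1} = k_t + lambda_k * (gamma * L(x) - L(G(z)))
    k_next = k + lambda_k * (gamma * loss_real - loss_G).item()
    k_next = min(max(k_next, 0.0), 1.0)
    return loss_D, loss_G, k_next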

Architectures

(figure: architecture)

Measure of Convergence

(figure: measure of convergence)
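
The curve above is the paper's global convergence measure, M_global = L(x) + |gamma * L(x) - L(G(z))|, which decreases as training converges. A one-line sketch, assuming the two reconstruction losses are available as plain floats:

def convergence_measure(loss_real, loss_fake, gamma=0.5):
    # M_global = L(x) + |gamma * L(x) - L(G(z))|
    return loss_real + abs(gamma * loss_real - loss_fake)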

Dependencies

python 3.6.4
pytorch 0.3.1.post2
visdom

Usage

Start the visdom server.

python -m visdom.server

Train on the CIFAR10 dataset. A checkpoint is saved automatically to checkpoint/run1 after every epoch.

python main.py --model_type skip_repeat --dataset cifar10 --env_name run1

You can load a checkpoint and continue training. Make sure --env_name matches the previous run.

python main.py --model_type skip_repeat --dataset cifar10 --env_name run1 --load_ckpt True
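
How the checkpoint file is structured depends on this repository, but saving and resuming in PyTorch generally follows a pattern like the sketch below; the file name latest.pth and the dictionary keys are assumptions for illustration.

import torch

def save_checkpoint(path, epoch, G, D, optim_G, optim_D):
    # Bundle the model and optimizer states together with the current epoch
    torch.save({'epoch': epoch,
                'G': G.state_dict(), 'D': D.state_dict(),
                'optim_G': optim_G.state_dict(), 'optim_D': optim_D.state_dict()},
               path)  # e.g. 'checkpoint/run1/latest.pth' (hypothetical name)

def load_checkpoint(path, G, D, optim_G, optim_D):
    # Restore the states in place and return the epoch to resume from
    state = torch.load(path)
    G.load_state_dict(state['G'])
    D.load_state_dict(state['D'])
    optim_G.load_state_dict(state['optim_G'])
    optim_D.load_state_dict(state['optim_D'])
    return state['epoch'] + 1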

You can monitor the training progress in your browser.

localhost:8097

You can also train on your own dataset. Make sure it is organized so that PyTorch's ImageFolder class can read it; see the data directory tree below and the loading sketch after it.

python main.py --model_type skip_repeat --dataset custom_dataset --env_name run1

Data directory tree

.
└── data
    ├── CelebA
    │   └── img_align_celeba
    │       ├── 000001.jpg
    │       ├── 000002.jpg
    │       ├── ...
    │       └── 202599.jpg
    ├── custom_dataset
    │   └── folder1
    │       ├── image1.jpg
    │       └── ...
    └── ...
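
Given the layout above, torchvision's ImageFolder can load either CelebA or a custom dataset; it expects at least one subfolder of images under the root, and the class labels it produces are simply ignored for GAN training. A minimal sketch, where the transform values are illustrative rather than this repository's defaults:

from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize(64),          # match --image_size
    transforms.CenterCrop(64),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])

# Each subfolder (img_align_celeba, folder1, ...) is treated as one class
dataset = datasets.ImageFolder(root='data/CelebA', transform=transform)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

for images, _ in loader:   # labels are discarded
    pass                   # feed `images` to the training step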

Results : CIFAR10(32x32)

python main.py --dataset cifar10 --image_size 32 --batch_size 16 --model_type skip_repeat --hidden_dim 64 --n_filter 32 --n_repeat 2

fixed generation

(figure: cifar10_fixed)

random generation

(figure: cifar10_random)

measure of convergence

(figure: cifar10_moc)

Results : CelebA(aligned, 64x64)

(you can download the CelebA dataset here)

python main.py --dataset celeba --image_size 64 --batch_size 16 --model_type skip_repeat --hidden_dim 64 --n_filter 64 --n_repeat 2

fixed generation

(figure: celeba_fixed)

random generation

(figure: celeba_random)

interpolation

(figures: celeba_interpolation1, celeba_interpolation2, celeba_interpolation3)

References

  1. BEGAN : Boundary Equilibrium Generative Adversarial Networks (arXiv:1703.10717)