Markov Chain GAN (MGAN)

TensorFlow code for Generative Adversarial Training for Markov Chains (ICLR 2017 Workshop Track).

Work by Jiaming Song, Shengjia Zhao, and Stefano Ermon.

<br/>

Preprocessing

Running the code requires some preprocessing: we convert the data to TFRecord files, which maximizes input throughput (as recommended by TensorFlow).

MNIST

The data used for training is here. Download and place the directory in ~/data/mnist_tfrecords.

(This can be done easily with a symlink; alternatively, change the path in models/mnist/__init__.py.)
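For instance, the symlink can be created with a short Python snippet; the download location below is a placeholder you should replace with wherever you extracted the data:

```python
from pathlib import Path

# Placeholder: wherever you extracted the downloaded TFRecord directory.
downloaded = Path("/path/to/downloaded/mnist_tfrecords")

# Expected location, as referenced by models/mnist/__init__.py.
target = Path.home() / "data" / "mnist_tfrecords"
target.parent.mkdir(parents=True, exist_ok=True)

# Create the symlink only if nothing is already at the target path.
if not target.exists() and not target.is_symlink():
    target.symlink_to(downloaded, target_is_directory=True)
```

The equivalent one-liner is `ln -s /path/to/downloaded/mnist_tfrecords ~/data/mnist_tfrecords`.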

CelebA

The data used for training is here. Download and place the directory in ~/data/celeba_tfrecords.

<br/>

Running Experiments

python mgan.py [data] [model] -b [B] -m [M] -d [critic iterations] --gpus [gpus]

where [B] is the number of transition steps from noise to data, [M] is the number of steps from data to data, [critic iterations] is the number of critic (discriminator) iterations, and [gpus] sets the CUDA_VISIBLE_DEVICES environment variable.
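Conceptually, the generator defines a Markov chain: starting from noise z, B transition steps produce a data-like sample, and further transitions map data to data. Here is a minimal sketch of that sampling loop, with a toy transition function standing in for the learned TensorFlow generator (all names below are illustrative, not the repo's API):

```python
import random

def transition(x):
    # Toy stand-in for the learned generator step x_{t+1} ~ p(x' | x_t).
    # In MGAN this is a neural network; here we just nudge x toward 1.0.
    return [xi + 0.5 * (1.0 - xi) + random.gauss(0, 0.01) for xi in x]

def sample_chain(x0, steps):
    """Run the transition operator `steps` times, keeping every state."""
    chain = [x0]
    for _ in range(steps):
        chain.append(transition(chain[-1]))
    return chain

B, M = 4, 3                                  # as in the commands below
z = [random.gauss(0, 1) for _ in range(8)]   # noise start state
noise_to_data = sample_chain(z, B)                  # B steps: noise -> data
data_to_data = sample_chain(noise_to_data[-1], M)   # M steps: data -> data
print(len(noise_to_data), len(data_to_data))        # prints "5 4"
```

Each chain contains its start state plus one state per transition step, which is why the lengths are B+1 and M+1.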

MNIST

python mgan.py mnist mlp -b 4 -m 3 -d 7 --gpus [gpus]

CelebA

Without shortcut connections:

python mgan.py celeba conv -b 4 -m 3 -d 7 --gpus [gpus]

With shortcut connections (the transition between samples will be much slower):

python mgan.py celeba conv_res -b 4 -m 3 -d 7 --gpus [gpus]

Custom Experiments

You can define your own problem and run experiments by adding a new model module under models/, using the existing modules (such as models/mnist) as templates.
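The exact interface is defined by the existing modules under models/; the skeleton below is only a hypothetical illustration of the shape such a module might take (every name in it is made up, so mirror models/mnist/__init__.py rather than this sketch when writing your own):

```python
# models/myproblem/__init__.py -- hypothetical skeleton, not the repo's API.

data_path = "~/data/myproblem_tfrecords"  # hypothetical dataset location

class TransitionGenerator:
    """Hypothetical stand-in for the learned transition x_t -> x_{t+1}."""
    def __call__(self, x):
        return x  # identity placeholder; a real model applies a network

class Discriminator:
    """Hypothetical critic scoring samples (or pairs of samples)."""
    def __call__(self, x):
        return 0.0  # placeholder score; a real critic outputs a logit

generator = TransitionGenerator()
discriminator = Discriminator()
print(discriminator(generator([0.0])))  # prints "0.0"
```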

<br/>

Figures

Each row comes from a single chain, sampled for 50 time steps.

MNIST

MNIST MLP

CelebA

Without shortcut connections: CelebA 1-layer conv

With shortcut connections: CelebA 1-layer conv with shortcuts

Related Projects

a-nice-mc: adversarial training for efficient MCMC kernels, which is based on this project.

Citation

If you use this code for your research, please cite our paper:

@article{song2017generative,
  title={Generative Adversarial Training for Markov Chains},
  author={Song, Jiaming and Zhao, Shengjia and Ermon, Stefano},
  journal={ICLR 2017 (Workshop Track)},
  year={2017}
}

Contact

tsong@cs.stanford.edu

Code for the Pairwise Discriminator is not available at the moment; I will add it when I have time.