# Recurrent-Affine-Transformation-for-Text-to-image-Synthesis

Official PyTorch implementation for our paper *Recurrent Affine Transformation for Text-to-image Synthesis*.
## Examples
## Requirements

- Python 3.8
- PyTorch 1.11.0+cu113
- easydict
- nltk
- scikit-image
- A 2080 Ti (set `nf=32` in `*.yaml`) or a larger GPU such as a 3090 (set `nf=64` in `*.yaml`)

Note that `nf=32` produces an IS of around 5.0 on CUB. To reproduce the final results, please use a GPU with more than 32 GB of memory.
## Installation

Clone this repo:

```bash
git clone https://github.com/senmaoy/RAT-GAN.git
cd RAT-GAN/code/
```
## Datasets Preparation

- Download the preprocessed metadata for birds and coco and save them to `data/`.
- Download the birds image data and extract it to `data/birds/`. Raw text data of the CUB dataset is available here.
- Download the coco dataset and extract the images to `data/coco/`.
- Download the flower dataset and extract the images to `data/flower/`. Raw text data of the flower dataset is available here.

Note that the flower dataset differs a bit from CUB and coco: it has a standalone dataset-processing script.
It's easy to train on your own dataset (the processing is similar to that of the flower dataset):

- Prepare a `captions.pickle` containing all the image paths. Note that `captions.pickle` should be prepared by yourself.
- Save `captions.pickle` under `data_dir`.
- Put all the captions of an image in a standalone txt file (one caption per line). This file is later read by `dataset_flower.py` at line 149: `cap_path = '%s/%s.txt' % ('/home/yesenmao/dataset/flower/jpg_text/', filenames['img'][i])`
- Run `main.py` as usual; `dataset_flower.py` will automatically process your own dataset.
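The steps above can be sketched as a small script. This is a minimal sketch only: `build_captions_pickle` is a hypothetical helper (not part of this repo), and the exact structure `dataset_flower.py` expects may differ; the `'img'` key simply mirrors the `filenames['img'][i]` lookup quoted above.

```python
import os
import pickle

def build_captions_pickle(image_dir, out_path):
    """Collect image names (without extensions) into a captions.pickle.

    Hypothetical helper: the loader in this repo may expect a different
    layout; the 'img' key mirrors filenames['img'][i] in dataset_flower.py.
    """
    # Gather every image file name, stripped of its extension.
    names = sorted(
        os.path.splitext(f)[0]
        for f in os.listdir(image_dir)
        if f.lower().endswith(('.jpg', '.jpeg', '.png'))
    )
    # Serialize under data_dir as captions.pickle.
    with open(out_path, 'wb') as fh:
        pickle.dump({'img': names}, fh)
    return names
```

Each name collected here should have a matching `<name>.txt` caption file (one caption per line) in the text directory that `dataset_flower.py` reads.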
## Pre-trained text encoder

- Download the pre-trained text encoder for CUB and save it to `../bird/`.
- Download the pre-trained text encoder for coco and save it to `../coco/`.
- Download the pre-trained text encoder for flower and save it to `../flower/`.
## Training

Train RAT-GAN models:

- For the bird dataset: `python main.py --cfg cfg/bird.yml`
- For the coco dataset: `python main.py --cfg cfg/coco.yml`
- For the flower dataset: `python main.py --cfg cfg/flower.yml`

The `*.yml` files are example configuration files for training/evaluating our models.
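For orientation, a training configuration might look like the sketch below. The field names are illustrative assumptions and should be checked against the shipped `cfg/*.yml` files; `NF` corresponds to the `nf` setting mentioned under Requirements, and `B_VALIDATION` is the training/evaluation switch described in the Evaluating section.

```yaml
# Illustrative sketch only -- check cfg/bird.yml for the actual keys.
CONFIG_NAME: 'bird'
DATA_DIR: 'data/birds'
GAN:
  NF: 32              # 32 for a 2080 Ti, 64 to reproduce the final results
B_VALIDATION: False   # set to True to run evaluation instead of training
```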
## Evaluating

### Download pretrained models

- RAT-GAN for bird: download and save it to `models/bird/`.
- RAT-GAN for coco: download and save it to `models/coco/`.
- RAT-GAN for flower: download and save it to `models/flower/`.
### Evaluate RAT-GAN models

- To evaluate our RAT-GAN on CUB, set `B_VALIDATION` to `True` in `bird.yml`, then run `python main.py --cfg cfg/bird.yml`
- To evaluate our RAT-GAN on coco, set `B_VALIDATION` to `True` in `coco.yml`, then run `python main.py --cfg cfg/coco.yml`
- We compute the inception score for models trained on birds using the StackGAN-inception-model.
- We compute FID for CUB and coco using [Inception-Score-FID-on-CUB-and-OXford](https://github.com/senmaoy/Inception-Score-FID-on-CUB-and-OXford.git).
## Citing RAT-GAN

If you find RAT-GAN useful in your research, please consider citing our paper:

```
@article{ye2022recurrent,
  title={Recurrent Affine Transformation for Text-to-image Synthesis},
  author={Ye, Senmao and Liu, Fei and Tan, Minkui},
  journal={arXiv preprint arXiv:2204.10482},
  year={2022}
}
```
If you are interested, join us in our WeChat group, where a dozen t2i partners are waiting for you! If the QR code has expired, you can add this WeChat: Unsupervised2020

The code is released for academic research use only. Please contact me at senmaoy@gmail.com.
## Reference

- **DF-GAN**: A Simple and Effective Baseline for Text-to-Image Synthesis [code]
- **StackGAN++**: Realistic Image Synthesis with Stacked Generative Adversarial Networks [code]
- **AttnGAN**: Fine-Grained Text to Image Generation with Attentional Generative Adversarial Networks [code]
- **DM-GAN**: Dynamic Memory Generative Adversarial Networks for Text-to-Image Synthesis [code]