# GazeAnimation - Official Tensorflow Implementation
**Dual In-painting Model for Unsupervised Gaze Correction and Animation in the Wild**<br>
Jichao Zhang, Jingjing Chen, Hao Tang, Wei Wang, Yan Yan, Enver Sangineto, Nicu Sebe<br>
In ACM MM 2020.<br>
Paper: https://arxiv.org/abs/2008.03834
## Network Architecture
## Dependencies

Python 3.6

```bash
pip install -r requirements.txt
```

Or, using Conda:

```bash
conda create -n GazeA python=3.6
conda install tensorflow-gpu=1.9  # or higher
```

The other packages are installed via pip.
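Once the environment is set up, a quick sanity check can confirm that the Python version matches and report whether TensorFlow is importable. This is just a sketch for convenience, not part of the repository:

```python
# Environment sanity check (a sketch, not part of the repo).
# The repo expects Python 3.6 and tensorflow-gpu 1.9 or higher.
import sys

def check_env():
    """Return (python_ok, tf_version); tf_version is None if TF is absent."""
    python_ok = sys.version_info[:2] >= (3, 6)
    try:
        import tensorflow as tf
        tf_version = tf.__version__
    except ImportError:
        tf_version = None
    return python_ok, tf_version

print(check_env())
```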
## Usage
- Clone this repo:

  ```bash
  git clone https://github.com/zhangqianhui/GazeAnimation.git
  cd GazeAnimation
  ```
- Download the CelebAGaze dataset

  Download the tar of the CelebAGaze dataset from the Google Drive link, then extract it:

  ```bash
  cd your_path
  tar -xvf CelebAGaze.tar
  ```

  Please edit options.py and change your dataset path.
- VGG-16 pretrained weights

  ```bash
  wget http://download.tensorflow.org/models/vgg_16_2016_08_28.tar.gz
  tar -xvf vgg_16_2016_08_28.tar.gz
  ```

  Please edit options.py and change your VGG path.
- Pretrained model for the PAM module

  Download it from the PAM pretrained model link. Please unzip it into pam_dir so that the files sit directly inside that directory, without a wrapping sub-directory.
- Train the model:

  ```bash
  python train.py --use_sp --gpu_id='0' --exper_name='log8_7' --crop_w=50 --crop_h=30
  ```
- Test the model:

  ```bash
  python test.py --exper_name='log8_7' --gpu_id='0' --crop_h=30 --crop_w=50 --test_sample_dir='test_sample_dir' --checkpoints='checkpoints'
  ```
- Or use the script for training:

  ```bash
  bash scripts/train_log8_7.sh
  ```
- Or use the script for testing; the pretrained model can be downloaded from [V1] or [V2]. Unzip pretrained.zip and move the files into 'experiments/checkpoints', then run:

  ```bash
  bash scripts/test_log8_7.sh
  ```
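The PAM pretrained model above must end up directly inside pam_dir, with no wrapping sub-directory. If your unzip tool recreates the archive's folder structure, a small helper like the following (hypothetical, not part of the repo) flattens the archive while extracting:

```python
# Hypothetical helper: extract every file in a zip directly into dest_dir,
# dropping any leading sub-directories stored inside the archive.
import os
import zipfile

def extract_flat(zip_path, dest_dir):
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue  # skip directory entries; we only want the files
            target = os.path.join(dest_dir, os.path.basename(info.filename))
            with zf.open(info) as src, open(target, 'wb') as dst:
                dst.write(src.read())
```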
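Several of the steps above ask you to edit options.py to set the dataset and VGG paths. As a rough illustration of what such options typically look like, here is a hypothetical sketch; the actual option names in the repository's options.py may differ:

```python
# Hypothetical sketch of path options like those edited in options.py;
# --data_dir and --vgg_path are illustrative names, not the repo's real flags.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--data_dir', type=str, default='your_path/CelebAGaze',
                    help='root of the extracted CelebAGaze dataset')
parser.add_argument('--vgg_path', type=str, default='your_path/vgg_16.ckpt',
                    help='path to the VGG-16 pretrained checkpoint')
args = parser.parse_args([])  # empty list: just use the defaults
print(args.data_dir, args.vgg_path)
```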
## Experiment Results

### Gaze Correction

<p align="center"><img width="100%" src="img/correction.png" /></p>

### Gaze Animation
## Citation

```
@inproceedings{zhangGazeAnimation,
  title={Dual In-painting Model for Unsupervised Gaze Correction and Animation in the Wild},
  author={Zhang, Jichao and Chen, Jingjing and Tang, Hao and Wang, Wei and Yan, Yan and Sangineto, Enver and Sebe, Nicu},
  booktitle={ACM MM},
  year={2020}
}
```