GazeAnimation - Official TensorFlow Implementation
Dual In-painting Model for Unsupervised Gaze Correction and Animation in the Wild
Jichao Zhang, Jingjing Chen, Hao Tang, Wei Wang, Yan Yan, Enver Sangineto, Nicu Sebe
In ACM MM 2020.
Network Architecture
Dependencies
Python=3.6
pip install -r requirements.txt
Or using Conda
conda create --name GazeA python=3.6
conda install tensorflow-gpu=1.9 (or a later 1.x release)
Install the other packages with pip.
Usage
- Clone this repo:
git clone https://github.com/zhangqianhui/GazeAnimation.git
cd GazeAnimation
- Download the CelebAGaze dataset
Download the tar of the CelebAGaze dataset from the Google Drive link.
cd your_path
tar -xvf CelebAGaze.tar
Please edit options.py and change the dataset path (see the sketch below).
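The exact flag name inside options.py is not specified in this README; purely as a hypothetical illustration (argparse-style, attribute names assumed), the entry to point at the extracted dataset would look roughly like this:

import argparse

parser = argparse.ArgumentParser()
# Point this at the directory containing the extracted CelebAGaze data
# (hypothetical flag name; check options.py for the real one).
parser.add_argument('--data_dir', type=str, default='/path/to/CelebAGaze',
                    help='root directory of the CelebAGaze dataset')
args = parser.parse_args()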
- VGG-16 pretrained weights
wget http://download.tensorflow.org/models/vgg_16_2016_08_28.tar.gz
tar -xvf vgg_16_2016_08_28.tar.gz
Please edit options.py and change the VGG checkpoint path (a loading sketch is shown below).
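How the repo wires in these weights is controlled by options.py; purely as a TF 1.x illustration (not this repo's code), the checkpoint extracted from the tar (vgg_16.ckpt) can be restored into the slim VGG-16 graph as follows:

import tensorflow as tf
from tensorflow.contrib.slim.nets import vgg
slim = tf.contrib.slim

# Build the VGG-16 graph and restore the downloaded weights (vgg_16.ckpt).
images = tf.placeholder(tf.float32, [None, 224, 224, 3])
with slim.arg_scope(vgg.vgg_arg_scope()):
    _, end_points = vgg.vgg_16(images, num_classes=1000, is_training=False)

saver = tf.train.Saver(slim.get_model_variables('vgg_16'))
with tf.Session() as sess:
    saver.restore(sess, 'vgg_16.ckpt')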
- Pretrained model for the PAM module
Download it from the PAM Pretrained model link. Please unzip it directly into pam_dir, without an extra sub-directory.
- Train the model from the command line with Python (the --crop_w/--crop_h flags are illustrated after the usage steps below):
python train.py --use_sp --gpu_id='0' --exper_name='log8_7' --crop_w=50 --crop_h=30
- Test the model
python test.py --exper_name='log8_7' --gpu_id='0' --crop_h=30 --crop_w=50 --test_sample_dir='test_sample_dir' --checkpoints='checkpoints'
Or use the scripts for training
bash scripts/train_log8_7.sh
Use the scripts for testing. The pretrained model can be downloaded from [V1] or [V2]. Unzip pretrained.zip and move the files into 'experiments/checkpoints'.
bash scripts/test_log8_7.sh
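The --crop_w/--crop_h flags in the commands above set the width and height of the eye patch the model operates on. The sketch below is a hypothetical NumPy illustration of cropping such a patch around an eye centre (function and variable names assumed, not taken from the repo):

import numpy as np

def crop_eye_patch(image, eye_center, crop_h=30, crop_w=50):
    # image: H x W x 3 array; eye_center: (y, x) pixel coordinates of one eye.
    # Returns a crop_h x crop_w patch centred on the eye, clipped to the image.
    h, w = image.shape[:2]
    y, x = eye_center
    y0 = int(np.clip(y - crop_h // 2, 0, h - crop_h))
    x0 = int(np.clip(x - crop_w // 2, 0, w - crop_w))
    return image[y0:y0 + crop_h, x0:x0 + crop_w]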
Experiment Results
Gaze Correction
Gaze Animation
Citation
@inproceedings{zhangGazeAnimation,
title={Dual In-painting Model for Unsupervised Gaze Correction and Animation in the Wild},
author={Zhang, Jichao and Chen, Jingjing and Tang, Hao and Wang, Wei and Yan, Yan and Sangineto, Enver and Sebe, Nicu},
booktitle={ACM MM},
year={2020}
}