RT-GENE & RT-BENE: Real-Time Eye Gaze and Blink Estimation in Natural Environments
This repository contains code and dataset references for two papers: RT-GENE (Gaze Estimation; ECCV 2018) and RT-BENE (Blink Estimation; ICCV 2019 Workshops).
RT-GENE (Gaze Estimation)
License + Attribution
The RT-GENE code is licensed under CC BY-NC-SA 4.0. Commercial usage is not permitted. If you use this dataset or the code in a scientific publication, please cite the following paper:
@inproceedings{FischerECCV2018,
author = {Tobias Fischer and Hyung Jin Chang and Yiannis Demiris},
title = {{RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments}},
booktitle = {European Conference on Computer Vision},
year = {2018},
month = {September},
pages = {339--357}
}
This work was supported in part by the Samsung Global Research Outreach program, and in part by the EU Horizon 2020 Project PAL (643783-RIA).
Overview + Accompanying Dataset
The code is split into four parts, each with its own README. There is also an accompanying dataset (alternative link) for the code. For more information, other datasets, and more open-source software, please visit the Personal Robotics Lab's website: https://www.imperial.ac.uk/personal-robotics/software/.
RT-GENE ROS package
The rt_gene directory contains a ROS package for real-time eye gaze and blink estimation. This contains all the code required at inference time.
RT-GENE Standalone Version
The rt_gene_standalone directory contains instructions for eye gaze estimation given a set of images. It shares code with the rt_gene package (above), in particular the code in rt_gene/src/rt_gene.
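RT-GENE represents gaze as a pair of angles (pitch and yaw). Purely as a hedged illustration of how such a prediction can be turned into a 3D direction for downstream use, the sketch below applies a spherical-to-Cartesian convention that is common in the gaze estimation literature; the exact sign and axis conventions used by this code base are defined in rt_gene/src/rt_gene and may differ.

# Illustrative sketch only: convert (pitch, yaw) gaze angles in radians into a
# unit 3D direction using one common convention. The repository's own
# convention may differ in sign or axis order.
import numpy as np

def gaze_angles_to_vector(theta, phi):
    # theta: pitch angle, phi: yaw angle, both in radians.
    x = -np.cos(theta) * np.sin(phi)
    y = -np.sin(theta)
    z = -np.cos(theta) * np.cos(phi)
    return np.array([x, y, z])

# Example usage with angles given in degrees and converted to radians.
print(gaze_angles_to_vector(np.deg2rad(-10.0), np.deg2rad(15.0)))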
RT-GENE Inpainting
The rt_gene_inpainting directory contains code to inpaint the region covered by the eye-tracking glasses.
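The inpainting in this repository is learning-based (GAN-based, per the RT-GENE paper). Purely as a hedged illustration of the underlying idea, i.e. filling in a masked glasses region, the stand-in below uses classical OpenCV inpainting with a hypothetical binary mask image; this is not the method used here.

# Illustrative stand-in only: classical inpainting of a masked region with
# OpenCV. The repository itself uses a learned (GAN-based) inpainting model;
# the file names below are hypothetical.
import cv2

face = cv2.imread('face_with_glasses.png')                     # input face image
mask = cv2.imread('glasses_mask.png', cv2.IMREAD_GRAYSCALE)    # 255 where the glasses are

filled = cv2.inpaint(face, mask, 3, cv2.INPAINT_TELEA)
cv2.imwrite('face_inpainted.png', filled)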
RT-GENE Model Training
The rt_gene_model_training directory contains code to train a deep neural network for eye gaze estimation using the inpainted images.
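As a rough, hedged sketch of what such a training setup can look like (this is not the architecture from the RT-GENE paper, which uses deeper backbones on two eye patches plus head pose, and model ensembles), a minimal Keras regressor from a single eye patch to two gaze angles is shown below; the patch size, dummy data, and hyperparameters are assumptions.

# Minimal, hypothetical sketch of a gaze regressor: eye patch in, two gaze
# angles out. NOT the network used in the RT-GENE paper; it only illustrates
# the shape of the training problem.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(36, 60, 3)),           # assumed patch size
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(2),                            # (pitch, yaw) regression
])
model.compile(optimizer='adam', loss='mse')

# Dummy data standing in for inpainted eye patches and gaze angle labels.
x = np.random.rand(16, 36, 60, 3).astype('float32')
y = np.random.uniform(-0.5, 0.5, size=(16, 2)).astype('float32')
model.fit(x, y, epochs=1, batch_size=8)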
RT-BENE (Blink Estimation)
License + Attribution
The RT-BENE code is licensed under CC BY-NC-SA 4.0. Commercial usage is not permitted. If you use our blink estimation code or dataset, please cite the relevant paper:
@inproceedings{CortaceroICCV2019W,
author={Kevin Cortacero and Tobias Fischer and Yiannis Demiris},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision Workshops},
title = {RT-BENE: A Dataset and Baselines for Real-Time Blink Estimation in Natural Environments},
year = {2019},
}
RT-BENE was supported by the EU Horizon 2020 Project PAL (643783-RIA) and a Royal Academy of Engineering Chair in Emerging Technologies to Yiannis Demiris.
Overview + Accompanying Dataset
The code is split into several parts, each having its own README. There is also an associated RT-BENE dataset. For more information, other datasets, and more open-source software, please visit the Personal Robotics Lab's website: https://www.imperial.ac.uk/personal-robotics/software/. Please note that a lot of the code is shared with RT-GENE (see above), hence the many references to RT-GENE below.
RT-BENE ROS package
The rt_gene directory contains a ROS package for real-time eye gaze and blink estimation. This contains all the code required at inference time. For blink estimation, please refer to the estimate_blink.py file.
RT-BENE Standalone Version
The rt_bene_standalone directory contains instructions for blink estimation given a set of images. It makes use of the code in rt_gene/src/rt_bene.
RT-BENE Model Training
The rt_bene_model_training directory contains the code required to train models with the labels contained in the RT-BENE dataset (see below). We will soon add evaluation code to this directory, too.
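Purely as a hedged sketch (not the baselines from the RT-BENE paper, which fine-tune larger backbones and use ensembles), a minimal binary blink classifier over eye patches might look as follows; the patch size, dummy data, and class weighting are assumptions, and real blink labels are heavily imbalanced towards "open".

# Hypothetical sketch only: a small binary blink classifier on eye patches.
# The RT-BENE baselines use larger fine-tuned backbones and ensembles; this
# block just illustrates the training problem, including the class imbalance
# between open and closed eyes.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(96, 96, 3)),            # assumed patch size
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation='sigmoid'),       # P(eye closed)
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=[tf.keras.metrics.Precision(), tf.keras.metrics.Recall()])

# Dummy stand-ins for annotated eye patches; closed-eye frames are rare, so a
# class weight (values made up here) counteracts the imbalance.
x = np.random.rand(32, 96, 96, 3).astype('float32')
y = (np.random.rand(32) < 0.1).astype('int32')
model.fit(x, y, epochs=1, batch_size=8, class_weight={0: 1.0, 1: 5.0})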
RT-BENE Dataset
We manually annotated images contained in the "noglasses" part of the RT-GENE dataset. The RT-BENE dataset on Zenodo contains the eye image patches and associated annotations to train the blink models.
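As a hedged example of consuming such annotations, one might load a per-subject label file and drop uncertain samples as below. The file name, column names, and the 0 (open) / 1 (closed) / 0.5 (uncertain) convention are assumptions here; check the Zenodo record and the rt_bene_model_training README for the actual annotation format.

# Hedged example of loading blink annotations; file name, column names and the
# 0 / 1 / 0.5 label convention are assumptions, not the documented format.
import pandas as pd

labels = pd.read_csv('blink_labels_subject_000.csv',
                     names=['image', 'blink'], header=None)

certain = labels[labels['blink'] != 0.5]             # drop uncertain annotations
closed_fraction = (certain['blink'] == 1.0).mean()
print(f'{len(certain)} labelled patches, {closed_fraction:.1%} closed')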