Event Based, Near Eye Gaze Tracking Beyond 10,000 Hz

arXiv preprint: https://arxiv.org/abs/2004.03577

Angelopoulos*, Martel*, Kohli, Conradt, and Wetzstein

<div align="center"> <img width="100%" alt="Eye-Tracker Illustration" src="misc/github_event_based_eye_tracking_teaser.gif"> </div> <div align="center"> Our gaze tracker in action (from <a href="https://www.youtube.com/watch?v=-7EneYIfinM&feature=youtu.be">our video here</a>). </div>

<br/><br/>

This repository includes instructions for downloading and using our 27-person, near-eye, event- and frame-based gaze-tracking dataset.

Environment setup

conda env create -f ebv-eye.yml 
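Then activate the environment (we assume the name defined in ebv-eye.yml is ebv-eye):

conda activate ebv-eye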

Data

Download the 27-person dataset using the setup script (you may first need to make it executable with chmod u+x setup.sh). Note that in the paper we only use subjects 4-27, because subjects 1-3 were recorded with a slightly different, suboptimal setup.

bash setup.sh 

Sample visualization

We have provided a simple Python script that reads and visualizes our data. Run it with:

python visualize.py --data_dir ./eye_data --subject 3 --eye left --buffer 1000 

The --buffer flag controls how many events are rendered as a group. Increasing it makes the rendering faster, but blockier.
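For example, for a quick, coarse preview of the right eye of subject 11 (the flag values here are illustrative):

python visualize.py --data_dir ./eye_data --subject 11 --eye right --buffer 10000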

This visualization is not real-time; the speed is limited by the rendering rate of matplotlib. The primary use of this visualizer is to provide a minimal example of proper data parsing and alignment.

Dataset organization

This dataset contains synchronized, IR-illuminated left- and right-eye data from 27 subjects. The data was collected using DAVIS 346b sensors from iniVation. For additional details regarding the setup and data collection, please refer to Section 4 of the associated paper.

The data is stored in the ./eye_data/ directory. This directory contains 27 subdirectories, one for each subject. Within each subject directory are two folders, titled 0 and 1: 0 corresponds to the left eye and 1 corresponds to the right eye. Within each eye directory is a frames directory for video data and an events.aerdat file for event data. Both formats are explained below.

**Example**: the video data for the left eye of subject 3 is located at "eye_data/user3/0/frames/"
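Putting this together, the directory layout is:

```
eye_data/
├── user1/
│   ...
└── user27/
    ├── 0/                  # left eye
    │   ├── frames/         # greyscale video frames (png)
    │   └── events.aerdat   # raw event stream
    └── 1/                  # right eye
        ├── frames/
        └── events.aerdat
```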

Event Data: The events.aerdat file contains all of the event-based information in a sequential, raw binary format. Every time an event was registered by the sensor, the following information was written directly to the binary file:

- the timestamp of the event, in microseconds
- the pixel coordinates (row and column) of the event
- the polarity of the event, i.e., whether the log-intensity at that pixel increased or decreased

**Example**: the event data for the right eye of subject 11 is located at "eye_data/user11/1/events.aerdat"
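As a starting point, here is a minimal sketch of a reader. The exact byte layout (field order, widths, and endianness) is an assumption on our part; treat visualize.py as the authoritative parser and adjust the dtype to match it. The read_events helper is our own illustration, not part of the repository.

```python
import numpy as np

# ASSUMED record layout -- verify against visualize.py before relying on it:
# 2-byte row, 2-byte column, 1-byte polarity, 4-byte timestamp (microseconds).
event_dtype = np.dtype([
    ("row", "<u2"),        # pixel row of the event
    ("col", "<u2"),        # pixel column of the event
    ("polarity", "u1"),    # 1 = brightness increase, 0 = decrease
    ("timestamp", "<u4"),  # time of the event in microseconds
])

def read_events(path):
    """Read a raw binary event stream into a structured numpy array."""
    return np.fromfile(path, dtype=event_dtype)

events = read_events("eye_data/user11/1/events.aerdat")
print(f"{len(events)} events spanning "
      f"{(events['timestamp'][-1] - events['timestamp'][0]) / 1e6:.1f} s")
```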

Frame Data: The frames directory contains regular video data stored as a set of image frames. The video was captured at ~25 FPS, and each frame is an 8-bit, 346 × 260 px greyscale png file. The filename of each frame encodes in what order the frames were captured, where the subject was looking, and when the frame was captured, in microseconds. The filename has the format "Index_Row_Column_Stimulus_Timestamp.png", where:

- Index is the order in which the frame was captured
- Row and Column are the on-screen pixel coordinates of the displayed stimulus, i.e., where the subject was looking
- Stimulus identifies the stimulus shown during that frame
- Timestamp is the capture time of the frame, in microseconds
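A small helper for parsing these filenames might look like the sketch below. The parse_frame_name function (and the example filename) are our own hypothetical illustrations that mirror the format string above, not code shipped with the repository:

```python
import os
from typing import NamedTuple

class FrameInfo(NamedTuple):
    index: int      # capture order of the frame
    row: int        # stimulus row on the screen (px)
    col: int        # stimulus column on the screen (px)
    stimulus: int   # stimulus identifier (assumed numeric)
    timestamp: int  # capture time in microseconds

def parse_frame_name(path: str) -> FrameInfo:
    """Split an 'Index_Row_Column_Stimulus_Timestamp.png' filename into fields."""
    stem, _ = os.path.splitext(os.path.basename(path))
    index, row, col, stimulus, timestamp = stem.split("_")
    return FrameInfo(int(index), int(row), int(col), int(stimulus), int(timestamp))

# Hypothetical example filename, for illustration only.
info = parse_frame_name("eye_data/user3/0/frames/0_540_960_1_1693231.png")
print(info.timestamp)
```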

The monitor on which the stimulus was displayed was a Sceptre X415BV_FSR with a 40-inch diagonal and a 1920x1080 pixel resolution, placed 40 cm away from the user, with the user's eyes roughly centered on the screen.
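With this geometry, the stimulus position from a frame's filename can be converted to an approximate gaze angle. The sketch below assumes the eyes are centered on the screen and uses the simple small-angle geometry implied above; it illustrates the setup and is not code from the repository.

```python
import math

SCREEN_W_PX, SCREEN_H_PX = 1920, 1080
DIAG_CM = 40 * 2.54                 # 40-inch diagonal in cm
VIEW_DIST_CM = 40.0                 # screen-to-eye distance in cm

# Pixel pitch: physical size of one pixel (square pixels assumed).
diag_px = math.hypot(SCREEN_W_PX, SCREEN_H_PX)
CM_PER_PX = DIAG_CM / diag_px       # ~0.046 cm per pixel

def gaze_angle_deg(row: int, col: int) -> tuple[float, float]:
    """Approximate (horizontal, vertical) gaze angles in degrees for a
    stimulus at pixel (row, col), assuming eyes centered on the screen."""
    dx_cm = (col - SCREEN_W_PX / 2) * CM_PER_PX
    dy_cm = (row - SCREEN_H_PX / 2) * CM_PER_PX
    return (math.degrees(math.atan2(dx_cm, VIEW_DIST_CM)),
            math.degrees(math.atan2(dy_cm, VIEW_DIST_CM)))

print(gaze_angle_deg(540, 1920))  # stimulus at the right edge, mid-height
```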

Cite

Event Based, Near Eye Gaze Tracking Beyond 10,000 Hz:

@article{angelopoulos2020event,
  title={Event Based, Near Eye Gaze Tracking Beyond 10,000 Hz},
  author={Angelopoulos, Anastasios N and Martel, Julien NP and Kohli, Amit PS and Conradt, Jorg and Wetzstein, Gordon},
  journal={arXiv preprint arXiv:2004.03577},
  year={2020}
}