Real-time gaze estimation via pupil center tracking

References (118)

Publisher
de Gruyter
Copyright
© 2018 Dario Cazzato et al
ISSN
2081-4836
eISSN
2081-4836
DOI
10.1515/pjbr-2018-0002

Abstract

Automatic gaze estimation that does not rely on expensive commercial eye-tracking hardware can enable several applications in the fields of human-computer interaction (HCI) and human behavior analysis. It is therefore not surprising that several related techniques and methods have been investigated in recent years. However, very few camera-based systems proposed in the literature are both real-time and robust. In this work, we propose a real-time gaze estimation system that requires no person-dependent calibration, copes with illumination changes and head pose variations, and works over a wide range of distances from the camera. Our solution is based on a 3-D appearance-based method that processes images from a built-in laptop camera. Real-time performance is obtained by combining head pose information with geometrical eye features to train a machine learning algorithm. Our method has been validated on a data set of images of users in natural environments and shows promising results. The possibility of a real-time implementation, combined with the good quality of gaze tracking, makes this system suitable for various HCI applications.
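The abstract describes combining head pose information with geometrical eye features to train a machine-learning gaze regressor. The sketch below illustrates that general idea only: the feature layout (yaw/pitch/roll plus normalized pupil-center offsets), the synthetic data, and the linear least-squares model are all illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200
head_pose = rng.uniform(-30, 30, size=(n, 3))   # assumed yaw, pitch, roll (degrees)
eye_feats = rng.uniform(-1, 1, size=(n, 2))     # assumed pupil-center offsets (normalized)

# Combine head pose with geometric eye features into one feature vector
X = np.hstack([head_pose, eye_feats])

# Synthetic ground-truth gaze angles: a noisy linear mix of the features
W_true = rng.normal(size=(5, 2))
y = X @ W_true + rng.normal(scale=0.01, size=(n, 2))

# Fit a linear regressor (with intercept) via ordinary least squares
Xb = np.hstack([X, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(Xb, y, rcond=None)

def predict_gaze(pose, pupil):
    """Predict (azimuth, elevation) gaze angles from head pose + pupil offsets."""
    f = np.concatenate([pose, pupil, [1.0]])
    return f @ W

pred = predict_gaze(head_pose[0], eye_feats[0])
```

Any appearance-based regressor could stand in for the least-squares fit here; the point is only that pose and eye features are concatenated into a single input for learning.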

Journal

Paladyn, Journal of Behavioral Robotics (de Gruyter)

Published: Feb 7, 2018
