
Abstract

Gaze movements play a crucial role in human-computer interaction (HCI) applications. Recently, gaze tracking systems with a wide variety of applications have attracted much interest from industry as well as the scientific community. State-of-the-art gaze trackers are mostly non-intrusive and report high estimation accuracies; however, they require complex setups such as camera and geometric calibration in addition to subject-specific calibration. In this paper, we introduce a multi-camera gaze estimation system that requires less user effort for system setup and calibration. The system is based on an adaptive fusion of multiple independent camera systems in which gaze estimation relies on simple cross-ratio (CR) geometry. Experimental results on real data show that the proposed system achieves a significant accuracy improvement, around 25%, over traditional CR-based single-camera systems through the novel adaptive multi-camera fusion scheme. The real-time system achieves an accuracy error below 0.9 degrees with very little calibration data (5 points) under natural head movements, which is competitive with more complex systems. Hence, the proposed system enables fast and user-friendly gaze tracking with minimal user effort without sacrificing much accuracy.
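To make the two core ideas of the abstract concrete, the sketch below illustrates (a) a single-camera CR-style gaze estimate, approximated here as a projective mapping from the four corneal glints (reflections of screen-corner IR LEDs) to the screen corners, applied to the pupil center, and (b) an adaptive fusion step that combines per-camera estimates by a reliability-weighted average. This is only a minimal illustration under stated assumptions, not the paper's implementation: the glint coordinates, pupil positions, and the glint-quality weights are hypothetical, and the paper's actual fusion scheme is described only at a high level in the abstract.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst (4+ point pairs) via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def cr_gaze_estimate(glints, pupil):
    """Single-camera CR-style estimate: map the pupil center through the
    projective transform taking the four corneal glints (ordered to match the
    screen corners) onto the screen, expressed in normalized [0,1]^2 coords."""
    screen = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
    H = homography_dlt(glints, screen)
    p = H @ np.array([pupil[0], pupil[1], 1.0])
    return p[:2] / p[2]

def fuse_estimates(estimates, weights):
    """Adaptive fusion (illustrative): reliability-weighted average of the
    per-camera point-of-regard estimates."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return (w[:, None] * np.asarray(estimates, dtype=float)).sum(axis=0)

# Toy usage: two cameras observing the same fixation (all values hypothetical).
glints_cam1 = np.array([[310, 205], [370, 206], [368, 260], [312, 258]], float)
glints_cam2 = np.array([[402, 180], [458, 182], [455, 233], [404, 230]], float)
g1 = cr_gaze_estimate(glints_cam1, pupil=(340, 232))
g2 = cr_gaze_estimate(glints_cam2, pupil=(430, 206))
fused = fuse_estimates([g1, g2], weights=[0.8, 0.6])  # e.g. glint-detection quality
print("per-camera:", g1, g2, "fused:", fused)
```

The weighting in fuse_estimates is where an adaptive scheme would plug in: cameras whose glints are poorly detected or occluded at a given frame receive lower weight, so the fused estimate degrades gracefully instead of failing with any single camera.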
