Institutional Repository
Technical University of Crete
Gaze prediction using machine learning for dynamic stereo manipulation in games

Koulieris, Georgios-Alexandros; Drettakis, George; Cunningham, Douglas W.; Mania, Aikaterini

URI: http://purl.tuc.gr/dl/dias/06A71739-08BE-46A4-99F7-C41C6C1639B9
Year: 2016
Type of Item: Conference Full Paper
Bibliographic Citation: G. A. Koulieris, G. Drettakis, D. Cunningham and K. Mania, "Gaze prediction using machine learning for dynamic stereo manipulation in games," in 18th IEEE Virtual Reality Conference, 2016, pp. 113-120. doi: 10.1109/VR.2016.7504694
Summary

Comfortable, high-quality 3D stereo viewing is becoming a requirement for interactive applications today. Previous research shows that manipulating disparity can alleviate some of the discomfort caused by 3D stereo, but it is best to do this locally, around the object the user is gazing at. The main challenge is thus to develop a gaze predictor in the demanding context of real-time, heavily task-oriented applications such as games. Our key observation is that player actions are highly correlated with the present state of a game, encoded by game variables. Based on this, we train a classifier to learn these correlations using an eye-tracker which provides the ground-truth object being looked at. The classifier is used at runtime to predict object category - and thus gaze - during game play, based on the current state of game variables. We use this prediction to propose a dynamic disparity manipulation method, which provides rich and comfortable depth. We evaluate the quality of our gaze predictor numerically and experimentally, showing that it predicts gaze more accurately than previous approaches. A subjective rating study demonstrates that our localized disparity manipulation is preferred over previous methods.
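To make the described pipeline concrete, the following is a minimal sketch, not the authors' implementation: it assumes a scikit-learn decision tree as the classifier, hypothetical game-state features (player health, distance to the nearest enemy, ammo count, time since last shot), eye-tracked object categories as training labels, and a per-object disparity_scale attribute standing in for the actual disparity manipulation.

from dataclasses import dataclass
from sklearn.tree import DecisionTreeClassifier

@dataclass
class SceneObject:
    name: str
    category: str
    disparity_scale: float = 1.0  # multiplier applied to this object's screen disparity

# Training data: each row is a snapshot of game-state variables (hypothetical
# features for illustration only); the label is the object category the eye
# tracker recorded as fixated at that moment.
X_train = [
    [100, 4.2, 30, 0.5],
    [35, 1.1, 2, 0.1],
    [80, 9.8, 25, 3.0],
]
y_train = ["enemy", "pickup", "waypoint"]

clf = DecisionTreeClassifier(max_depth=5)
clf.fit(X_train, y_train)

def manipulate_disparity(game_state, scene_objects):
    """Predict the gazed-at object category from the current game state and
    keep full disparity only around objects of that category (sketch only)."""
    predicted = clf.predict([game_state])[0]
    for obj in scene_objects:
        obj.disparity_scale = 1.0 if obj.category == predicted else 0.3
    return predicted

# Example runtime call with a new game-state snapshot.
scene = [SceneObject("guard", "enemy"), SceneObject("medkit", "pickup")]
print(manipulate_disparity([60, 2.0, 12, 0.8], scene))

In this sketch the predicted category drives a simple two-level disparity scaling; the paper's method instead evaluates the predictor against eye-tracker ground truth and applies a localized disparity manipulation around the gazed object.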
