Colloquium

Eye-tracking in Virtual Reality: Calibration and validation for the Interpolated Velocity Threshold Algorithm

Organised by Laboratory of Geo-information Science and Remote Sensing
Date

Wed 17 April 2024 10:30 to 11:00

Venue Gaia, building number 101
Droevendaalsesteeg 3
6708 PB Wageningen
+31 (0) 317 - 48 17 00
Room 2

By Ate Jepma

Abstract
Movements of the human eyes correlate with many aspects of our cognition, attention and spatial awareness. How the eyes relate to these aspects has puzzled researchers for decades. To capture, understand and demarcate this, eye-movement classification, such as identifying fixations, has been a topic of extensive scrutiny in two-dimensional environments (e.g., a desktop screen with a mobile eye-tracker). However, with recent developments in Virtual Reality (VR), high-frequency eye-tracking for fixation identification has become possible in immersive virtual environments (IVEs). The parameters used in eye-fixation algorithms greatly affect the results of fixation identification. Systematically validating and optimizing these parameters is therefore vital to determining the accuracy of fixation identification. In two-dimensional environments this principle has been extensively researched; however, systematic parameter validation in IVEs is often missing. This thesis research aims to provide insight into methodologies and experimental setups for systematic parameter tuning and validation for future eye-tracking algorithm calibrations. The algorithm used in this study is the Interpolated Velocity Threshold (I-VT) algorithm. This velocity-based algorithm assigns successive gaze points to a fixation when the angular velocity between the points is lower than the velocity threshold; 'slower' eye movements are thus grouped together and classified as a fixation. This work provides validated velocity-threshold parameters for the I-VT algorithm, to be applied in future research into eye-tracking across virtual distance with a head-mounted display. To achieve this, a VR experiment (N = 14) was set up to include depth intervals and head rotation, the unique characteristics of experiencing an IVE, in determining the optimal fixation identification parameters across depth and velocity thresholds.
Two loss functions are created: one for the average fixation count relative to the number of targets, and one for the number of hits within on-target fixations relative to non-target hits. Finally, several methods are proposed to select optimal parameter ranges. A K-fold cross-validation of the selected optimum parameters indicates that the optimum average velocity threshold is 25.68°/s. Across 11 depth intervals of 1 meter each, the optimum values range between 20 and 35°/s. No clear pattern in the optimum parameter was detected with increasing distance. The identified optimum velocity-threshold values align with prior research into two-dimensional setups. However, the effect of varying target size on the optimal velocity threshold was not considered in this research; future studies could explore this to derive optimal values independent of size and depth for the I-VT algorithm in immersive virtual environments.
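The core of the I-VT classification described in the abstract can be sketched in a few lines: compute the angular velocity between successive gaze samples, keep samples below the velocity threshold, and group consecutive kept samples into fixations. The sketch below is illustrative only, not the thesis implementation; the function name, the 3D-direction input format, and the use of the reported average threshold (25.68°/s) as a default are assumptions.

```python
import math

def ivt_classify(gaze_dirs, timestamps, threshold_deg_s=25.68):
    """Minimal I-VT sketch (not the thesis code).

    gaze_dirs: unit 3D gaze-direction vectors (x, y, z)
    timestamps: sample times in seconds
    threshold_deg_s: velocity threshold; default is the study's
        reported optimum average, used here for illustration
    Returns a list of (start_index, end_index) fixation groups.
    """
    # Label each sample as fixation (True) when the angular velocity
    # from the previous sample is below the threshold.
    labels = [False] * len(gaze_dirs)
    for i in range(1, len(gaze_dirs)):
        a, b = gaze_dirs[i - 1], gaze_dirs[i]
        dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
        angle_deg = math.degrees(math.acos(dot))  # angular distance
        dt = timestamps[i] - timestamps[i - 1]
        if angle_deg / dt < threshold_deg_s:
            labels[i] = True
    # Group consecutive fixation samples into fixations.
    fixations, start = [], None
    for i, is_fix in enumerate(labels):
        if is_fix and start is None:
            start = i
        elif not is_fix and start is not None:
            fixations.append((start, i - 1))
            start = None
    if start is not None:
        fixations.append((start, len(labels) - 1))
    return fixations
```

For example, with samples at 100 Hz that hold one direction, jump 30° in a single step (a 3000°/s saccade), and then hold the new direction, the sketch returns two fixation groups separated at the jump.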