Abstract
Head-mounted, video-based eye tracking is becoming increasingly common and holds promise for a range of applications. Here, we provide a practical and systematic assessment of the sources of measurement uncertainty for one such device, the Pupil Core, in three eye-tracking domains: (1) the 2D scene camera image; (2) the physical rotation of the eye relative to the 3D space of the scene camera; and (3) the projection of the estimated gaze point onto the target plane or into world coordinates. We also assess eye camera motion during active tasks relative to the eye and the scene camera, an important consideration because a rigid arrangement of the eye and scene cameras is essential for proper alignment of the detected gaze. We find that eye camera motion, improper gaze point depth estimation, and erroneous eye models can all add noise that must be accounted for in the experimental design. Further, while calibration accuracy and precision estimates can help assess data quality in the scene camera image, they may not reflect errors and variability in gaze point estimation. These findings underscore the importance of eye model constancy for comparisons across experimental conditions and suggest that additional assessments of data reliability may be warranted for experiments that require the gaze point or that measure eye movements relative to the external world.
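The third domain, projecting the estimated gaze point onto an external target plane, is commonly handled with a planar homography between scene-camera image coordinates and target-plane coordinates. Below is a minimal numpy sketch of such a mapping, assuming four known marker correspondences between the scene image and the plane; `fit_homography` and `map_gaze` are illustrative names, not part of the Pupil Core API, and a production pipeline would also handle lens distortion and use more than four correspondences.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src via the direct
    linear transform (DLT). src, dst: sequences of >= 4 corresponding
    2D points, e.g. scene-image pixels -> target-plane coordinates."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography (up to scale) is the right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so the bottom-right entry is 1

def map_gaze(H, point):
    """Project a 2D gaze point through H (homogeneous multiply, then
    perspective divide)."""
    p = H @ np.array([point[0], point[1], 1.0])
    return p[:2] / p[2]
```

For example, with four scene-image corners of a calibration target and their known positions on the plane, `map_gaze` converts each gaze sample from pixels to plane coordinates; parallax and depth-estimation errors discussed above show up as systematic offsets in this projected signal.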
Acknowledgements
The authors thank Drs. Catherine P. Agathos and Kamran Binaee for their feedback and invaluable suggestions, and Dr. James Coughlan for assistance with the implementation and interpretation of homography transformations. This work was supported by National Eye Institute Grants R00-EY-026994 and R01 AG073157 (to N. Shanidze) and by the Smith-Kettlewell Eye Research Institute.
Open practices
The eye-tracking and IMU data and data analysis software used in this study are available at: https://bitbucket.org/eyehead/pupil-noise/src/master/. None of the experiments was preregistered.
Cite this article
Velisar, A., Shanidze, N.M. Noise estimation for head-mounted 3D binocular eye tracking using Pupil Core eye-tracking goggles. Behav Res 56, 53–79 (2024). https://doi.org/10.3758/s13428-023-02150-0