
A novel system for the automatic reconstruction of visual field based on eye tracking and machine learning

Multimedia Tools and Applications

Abstract

Eye movement perimetry (EMP) is a paradigm developed to assess the visual field without the need to suppress natural eye movements during the test. Unlike standard automated perimetry (SAP), where the patient's responses are recorded with a button, EMP uses the natural eye-movement reflex as the response during the evaluation. The reliability of EMP depends on correctly determining whether a stimulus was seen, which in turn depends on an adequate analysis of the eye-movement data. However, many studies in EMP have focused on characterizing eye movements, and only a few authors have documented their methods for determining whether a peripheral stimulus was seen during the test. Furthermore, many of them use static thresholds to perform the classification, and it is not clear how these threshold values were obtained. Based on the foregoing, we developed a threshold test based on FASTPAC C24-2 and EMP for visual field assessment. Our method uses two machine learning techniques to classify whether a stimulus was seen: (1) cascaded K-Means and Bayesian classifiers (KBC) and (2) an Artificial Neural Network (ANN). The method was validated with twenty healthy participants (13 women and 7 men) aged 19–43 years (µ = 26 ± 5 years), each of whom performed both an EMP test and an SAP emulation test. Results were compared with gaze-trajectory annotations made by an expert, yielding accuracy values between 96.8% and 98.9% for KBC and ANN, and between 90.5% and 92% for the SAP emulation.
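The cascaded KBC idea described above can be sketched in a minimal, hypothetical form: an unsupervised K-Means stage first groups gaze responses into coarse clusters, and a Gaussian Bayesian classifier then makes the seen/not-seen decision. The feature choices (saccade latency, final gaze error) and the synthetic data below are illustrative assumptions, not the paper's actual features or parameters.

```python
# Hypothetical sketch of a cascaded K-Means + Bayesian classifier (KBC)
# for labelling stimuli as "seen" / "not seen" from gaze features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Synthetic per-stimulus features: [saccade latency (ms), final gaze error (deg)].
# "Seen" stimuli elicit fast, accurate saccades; "not seen" responses are slow
# and land far from the stimulus. These distributions are invented for the demo.
seen = rng.normal([250.0, 1.0], [40.0, 0.5], size=(100, 2))
unseen = rng.normal([600.0, 8.0], [80.0, 2.0], size=(100, 2))
X = np.vstack([seen, unseen])
y = np.array([1] * 100 + [0] * 100)  # 1 = seen, 0 = not seen

# Stage 1: K-Means groups the gaze responses into two coarse clusters.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Stage 2: a Gaussian naive Bayes classifier makes the final decision,
# using the cluster assignment as an additional feature.
X_aug = np.column_stack([X, km.labels_])
clf = GaussianNB().fit(X_aug, y)

acc = clf.score(X_aug, y)
print(f"training accuracy: {acc:.2f}")
```

On well-separated synthetic data like this, the cascade classifies nearly all responses correctly; the point of the second stage is that the Bayesian model can weigh the raw features and the cluster label probabilistically rather than relying on a fixed static threshold.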


Data Availability

Not applicable.

Code Availability

Not applicable.


Acknowledgements

E.A. Martínez-González acknowledges CONACYT-México for scholarship 712805.

Author information


Contributions

Not applicable.

Corresponding author

Correspondence to Martin O. Mendez.

Ethics declarations

Ethics approval

The study presented in this article was conducted in accordance with the ethical standards of the institution (Universidad Autónoma de San Luis Potosí) and with the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards.

Consent to participate

Signed informed consent was obtained from all the participants included in the study.

Consent for publication

All participants gave their consent for the data acquired during the experiment to be used in this research, with participant privacy and data confidentiality maintained.

Conflicts of interest/Competing interests

The authors have no conflicts of interest to declare that are relevant to the content of this article.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Martínez-González, E.A., Alba, A., Arce-Santana, E. et al. A novel system for the automatic reconstruction of visual field based on eye tracking and machine learning. Multimed Tools Appl 82, 27193–27215 (2023). https://doi.org/10.1007/s11042-023-14464-4
