
Theoretical error analysis of spotlight-based instrument localization for retinal surgery

Published online by Cambridge University Press:  26 January 2023

Mingchuan Zhou*
Affiliation:
College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, China
Felix Hennerkes
Affiliation:
Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
Jingsong Liu
Affiliation:
Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
Zhongliang Jiang
Affiliation:
Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
Thomas Wendler
Affiliation:
Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
M. Ali Nasseri
Affiliation:
Augenklinik und Poliklinik, Klinikum rechts der Isar der Technischen Universität München, München, Germany
Iulian Iordachita
Affiliation:
Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
Nassir Navab
Affiliation:
Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
*Corresponding author. E-mail: mczhou@zju.edu.cn

Abstract

Retinal surgery is widely considered a complicated and challenging task, even for specialists. Image-guided, robot-assisted intervention is among the novel and promising solutions that may enhance human capabilities in this setting. In this paper, we demonstrate the feasibility of using spotlights for 5D guidance of a microsurgical instrument. The theoretical basis of localizing the instrument from the projection of a single spotlight is analyzed to deduce the position and orientation of the spotlight source. The use of multiple spotlights is also proposed to examine whether the performance boundaries can be pushed further. The proposed method is verified in a high-fidelity simulation environment built with the 3D creation suite Blender. Experimental results show that the average positioning error is 0.029 mm using a single spotlight and 0.025 mm with three spotlights, while the corresponding rotational errors are 0.124° and 0.101°, which suggests that the approach is promising for instrument localization in retinal surgery.
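The core geometric idea behind the abstract, that a spotlight mounted on the instrument casts an elliptical light spot whose position and shape encode the instrument's pose, can be sketched with a simple forward model. The snippet below is our own illustrative code, not the authors' implementation; it assumes a locally planar retina at z = 0 and a known cone half-angle, and projects the boundary rays of the light cone onto that plane to show how tilting the spotlight elongates the spot along the tilt direction:

```python
import numpy as np

def spot_boundary(apex, axis, half_angle, n=200):
    """Intersect the boundary rays of a light cone with the plane z = 0.

    apex       -- 3D position of the light source (above the plane)
    axis       -- cone axis direction (must point toward the plane)
    half_angle -- cone half-angle in radians
    Returns an (n, 2) array of boundary points of the light spot.
    """
    axis = axis / np.linalg.norm(axis)
    # Build an orthonormal basis (u, v) perpendicular to the cone axis.
    tmp = np.array([1.0, 0.0, 0.0])
    if abs(axis @ tmp) > 0.9:
        tmp = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, tmp)
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    # Unit directions lying on the cone surface.
    d = (np.cos(half_angle) * axis[None, :]
         + np.sin(half_angle) * (np.cos(t)[:, None] * u
                                 + np.sin(t)[:, None] * v))
    # Ray-plane intersection: apex + s * d with z = 0.
    s = -apex[2] / d[:, 2]
    pts = apex[None, :] + s[:, None] * d
    return pts[:, :2]

apex = np.array([0.0, 0.0, 10.0])                    # source 10 mm above plane
down = np.array([0.0, 0.0, -1.0])                    # pointing straight down
tilted = np.array([np.sin(0.5), 0.0, -np.cos(0.5)])  # tilted 0.5 rad about y

spot0 = spot_boundary(apex, down, half_angle=0.1)
spot1 = spot_boundary(apex, tilted, half_angle=0.1)
# The untilted spot is a circle of radius h * tan(alpha); the tilted spot
# is an ellipse elongated along the tilt (x) direction.
```

Inverting this model, i.e. fitting an ellipse to the observed spot and solving for the source position and axis, is the localization problem the paper analyzes; with several spotlights, each spot adds independent constraints on the same pose, which is the intuition behind the reported accuracy gain from one to three spotlights.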

Type
Research Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press

