
A novel Gaussian Distribution and Tukey Weight (GDaTW) algorithms: deformation accuracy for augmented reality (AR) in facelift surgery

Published in Multimedia Tools and Applications

Abstract

In facelift surgery, tracking the exact positions of the patient’s facial nerves, blood vessels and other tissues helps surgeons make the right decisions during the operation. With Augmented Reality technologies that display a mapping of the soft tissues beneath the skin, surgeons can detect and locate the regions to be cut before making the first incision. Augmented Reality-assisted surgery has not previously been studied for facelift procedures involving facial soft tissue, so this paper provides invaluable first steps towards more focused studies in this area. The systems currently available for the oral and maxillofacial region offer limited support for the elastic nature of facial soft tissue, whose shape shifts and changes with the patient’s movement or with movement caused by the surgeon during the operation. This paper aims to increase overlay accuracy by reducing the elastic deformation error for Augmented Reality in facelift surgery. The proposed system uses a Gaussian Distribution and Tukey Weight (GDaTW) algorithm to reduce the deformation error after the geometric error algorithm has been applied. The test results confirm that the new algorithm improves the video accuracy by approximately 0.10 mm by reducing the overlay error caused by elastic deformation, at a display frame rate of 5–10 frames per second compared with 10–13 frames per second in the existing system. The proposed improvement therefore increases overlay accuracy by reducing the elastic deformation error. The correct display of the soft tissues of the mandibular region using Augmented Reality is intended to help plastic surgeons perform facelift surgery with confidence and avoid making wrong cuts in vital areas occluded by the patient’s skin.
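To make the weighting scheme named in the title concrete, the sketch below shows one way a Gaussian kernel over correspondence residuals can be combined with Tukey's biweight to down-weight unreliable point pairs before a weighted rigid alignment step. This is an illustrative sketch only: the kernel width sigma, the Tukey cutoff c, the multiplicative combination of the two weights and all function names (gaussian_weight, tukey_weight, gdatw_weights, weighted_rigid_update) are assumptions made for exposition and do not reproduce the GDaTW formulation given in the full paper.

```python
# Illustrative sketch only: parameter values, the way the two weights are
# combined, and the helper names are assumptions, not the authors' exact method.
import numpy as np

def gaussian_weight(residuals, sigma=1.0):
    """Gaussian weight: small residuals receive weights near 1."""
    return np.exp(-(residuals ** 2) / (2.0 * sigma ** 2))

def tukey_weight(residuals, c=4.685):
    """Tukey biweight: residuals beyond the cutoff c receive zero weight.
    In practice c is chosen relative to the scale of the residuals."""
    w = np.zeros_like(residuals)
    inside = np.abs(residuals) <= c
    w[inside] = (1.0 - (residuals[inside] / c) ** 2) ** 2
    return w

def gdatw_weights(source, target, sigma=1.0, c=4.685):
    """Combined per-correspondence weights for a weighted registration step.
    source, target: (N, 3) arrays of corresponding 3D points."""
    residuals = np.linalg.norm(source - target, axis=1)
    return gaussian_weight(residuals, sigma) * tukey_weight(residuals, c)

def weighted_rigid_update(source, target, weights):
    """One weighted least-squares rigid alignment step (Kabsch/Umeyama style)."""
    w = weights / (weights.sum() + 1e-12)
    mu_s = (w[:, None] * source).sum(axis=0)
    mu_t = (w[:, None] * target).sum(axis=0)
    H = (source - mu_s).T @ np.diag(w) @ (target - mu_t)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_t - R @ mu_s
    return R, t

# Example: weight noisy correspondences, then compute one alignment update.
rng = np.random.default_rng(0)
src = rng.normal(size=(200, 3))
tgt = src + rng.normal(scale=0.05, size=(200, 3))  # slightly perturbed copy
w = gdatw_weights(src, tgt, sigma=0.1, c=0.3)
R, t = weighted_rigid_update(src, tgt, w)
```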


References

  1. Ai D, Yang J, Fan J, Zhao Y, Song X, Shen J, … Wang Y (2016) Augmented reality based real-time subcutaneous vein imaging system. Biomed Opt Express 7(7):2565–2585

  2. Bernhardt S, Nicolau SA, Soler L, Doignon C (2017) The status of augmented reality in laparoscopic surgery as of 2016. Med Image Anal 37:66–90

  3. Chen Y, Medioni G (1992) Object modelling by registration of multiple range images. Image Vis Comput 10(3):145–155

  4. Chu Y, Yang J, Ma S, Ai D, Li W, Song H, … Wang Y (2017) Registration and fusion quantification of augmented reality based nasal endoscopic surgery. Med Image Anal 42:241–256

  5. Gold S, Lu CP, Rangarajan A, Pappu S, Mjolsness E (1995) New algorithms for 2D and 3D point matching: Pose estimation and correspondence. Adv Neural Inf Proces Syst 957–964

  6. Haouchine N, Cotin S, Peterlik I, Dequidt J, Lopez MS, Kerrien E, Berger MO (2015) Impact of soft tissue heterogeneity on augmented reality for liver surgery. IEEE Trans Vis Comput Graph 21(5):584–597

  7. Hayashi Y, Misawa K, Hawkes DJ, Mori K (2016) Progressive internal landmark registration for surgical navigation in laparoscopic gastrectomy for gastric cancer. Int J Comput Assist Radiol Surg 11(5):837–845

  8. Maintz JBA, Viergever MA (1998) A survey of medical image registration. Med Image Anal 2(1):1–36

  9. Kalal Z, Mikolajczyk K, Matas J (2012) Tracking-learning-detection. IEEE Trans Pattern Anal Mach Intell 34(7):1409–1422

  10. Kersten-Oertel M, Gerard I, Drouin S, Mok K, Sirhan D, Sinclair DS, Collins DL (2015) Augmented reality in neurovascular surgery: feasibility and first uses in the operating room. Int J Comput Assist Radiol Surg 10(11):1823–1836

  11. Kong S, Haouchine N, Soares R, Klymchenko A, Andreiuk B, Marques B, … Marescaux J (2017) Robust augmented reality registration method for localization of solid organs’ tumors using CT-derived virtual biomechanical model and fluorescent fiducials. Surg Endosc 31(7):2863–2871

  12. Lu P, Barazzetti L, Chandran V, Gavaghan K, Weber S, Gerber N, Reyes M (2018) Highly accurate facial nerve segmentation refinement from CBCT/CT imaging using a super resolution classification approach. IEEE Trans Biomed Eng 65(1):178–188. https://doi.org/10.1109/TBME.2017.2697916

  13. Murugesan Y, Alsadoon A, Manoranjan P, Prasad P (2018) A novel rotational matrix and translation vector algorithm: Geometric accuracy for augmented reality in oral and maxillofacial surgeries. International Journal of Medical Robotics and Computer Assisted Surgery 14(2). https://doi.org/10.1002/rcs.1889

  14. Ng S, Alsadoon A, Prasad P, Deva A, Manoranjanl P, Elchouemi A, Lau S-H (2018) Reducing Deformation - Augmented Reality (AR) in Facelift Surgery: A Theoretical and Mathematical Study. 7th International Conference on Computer and Communication Engineering (ICCCE), 2018. https://doi.org/10.1109/ICCCE.2018.8539344

  15. Nicolau S, Soler L, Mutter D, Marescaux J (2011) Augmented reality in laparoscopic surgical oncology. Surg Oncol 20(3):189–201

  16. Nosrati MS, Amir-Khalili A, Peyrat JM, Abinahed J, Al-Alao O, Al-Ansari A, … Hamarneh G (2016) Endoscopic scene labelling and augmentation using intraoperative pulsatile motion and colour appearance cues with preoperative anatomical priors. Int J Comput Assist Radiol Surg 11(8):1409–1418

  17. Puerto-Souza GA, Cadeddu JA, Mariottini G (2014) Toward long-term and accurate augmented-reality for monocular endoscopic videos. IEEE Trans Biomed Eng 61(10):2609–2620

  18. Ren S, Jain DK, Guo K, Xu T, Chi T (2019) Towards efficient medical lesion image super-resolution based on deep residual networks. Signal Process Image Commun 75:1–10

  19. Rusinkiewicz S, Levoy M (2001) Efficient variants of the ICP algorithm. In: Proceedings of the Third International Conference on 3-D Digital Imaging and Modeling. IEEE, pp 145–152

  20. Sagawa R, Akasaka K, Yagi Y, Hamer H, van Gool L (2009) Elastic convolved ICP for the registration of deformable objects. In: 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops). IEEE, pp 1558–1565

  21. Ulrich M, Wiedemann C, Steger C (2012) Combining scale-space and similarity-based aspect graphs for fast 3d object recognition. IEEE Trans Pattern Anal Mach Intell 34(10):1902–1914

  22. Wang R, Geng Z, Zhang Z, Pei R, Meng X (2017a) Autostereoscopic augmented reality visualization for depth perception in endoscopic surgery. IEEE Journal of Biomedical and Health Informatics 22(5):1540–1551. https://doi.org/10.1109/JBHI.2017.2770214

  23. Wang J, Suenaga H, Yang L, Kobayashi E, Sakuma I (2016) Video see-through augmented reality for oral and maxillofacial surgery. Int J Med Rob Comput Assisted Surg 13:e1754. https://doi.org/10.1002/rcs.1754

  24. Wang R, Zhang M, Meng X, Geng Z, Wang FY (2017b) 3D Tracking for Augmented Reality Using Combined Region and Dense Cues in Endoscopic Surgery. IEEE J Biomed Health Inform, 12–25


Acknowledgements

This study was supported in part by Study Support Manager Angelika Maag and CSU student Shelsa Ng et al. [14] from the Charles Sturt University Study Centre, Sydney, Australia.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Abeer Alsadoon.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix 1

Table 5 Abbreviations for the terms used in this paper


About this article


Cite this article

Alsadoon, A., Murugesan, Y., Prasad, P.W.C. et al. A novel Gaussian Distribution and Tukey Weight (GDaTW) algorithms: deformation accuracy for augmented reality (AR) in facelift surgery. Multimed Tools Appl 80, 15719–15743 (2021). https://doi.org/10.1007/s11042-021-10590-z

