
Video and accelerometer-based motion analysis for automated surgical skills assessment

  • Original Article
  • Journal: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data).

Methods

We conduct a large study of basic surgical skill assessment on a dataset containing video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features, approximate entropy and cross-approximate entropy, which quantify the predictability and regularity of fluctuations in time-series data. The proposed features are compared against the existing Sequential Motion Texture, Discrete Cosine Transform, and Discrete Fourier Transform methods for surgical skills assessment.
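As a rough sketch of the idea behind these features (not the authors' implementation, whose parameters are not given here), approximate entropy for a one-dimensional time series can be computed from the fraction of length-m templates that stay within a tolerance r of each other:

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy (Pincus, 1991): lower values indicate a more
    regular, predictable time series; higher values, a more irregular one."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # common heuristic: tolerance = 20% of signal std

    def phi(m):
        # All overlapping length-m templates of the series.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-coordinate) distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # C_i: fraction of templates within tolerance r of template i
        # (self-matches included, so C_i > 0 and the log is safe).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# A regular sine wave scores lower (more predictable) than white noise.
t = np.linspace(0, 8 * np.pi, 500)
rng = np.random.default_rng(0)
print(approx_entropy(np.sin(t)))             # low: regular signal
print(approx_entropy(rng.normal(size=500)))  # higher: irregular signal
```

Cross-approximate entropy follows the same template-matching scheme but compares templates drawn from two different series (e.g. two sensor channels), measuring their mutual asynchrony.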

Results

We report the average performance of different features across all applicable OSATS-like criteria for the suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods on video data, achieving average classification accuracies of 95.1% and 92.2% for suturing and knot tying, respectively. On accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusing video and acceleration features can further improve overall skill-assessment performance.
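The fusion step described here can be as simple as early (feature-level) fusion. A minimal sketch, in which the variable names and feature dimensions are illustrative assumptions rather than the paper's actual setup:

```python
import numpy as np

# Illustrative only: dimensions and names are assumptions, not the paper's setup.
rng = np.random.default_rng(42)
n_trials = 6
video_feats = rng.normal(size=(n_trials, 8))  # e.g. entropy features from video
accel_feats = rng.normal(size=(n_trials, 4))  # e.g. entropy features from accelerometer

# Early fusion: concatenate per-trial modality features, then z-normalize
# each dimension so that no single modality dominates the classifier.
fused = np.concatenate([video_feats, accel_feats], axis=1)
fused = (fused - fused.mean(axis=0)) / fused.std(axis=0)
print(fused.shape)  # (6, 12)
```

The fused vectors would then feed a standard classifier; late fusion (combining per-modality classifier scores) is the usual alternative when the modalities are sampled at very different rates.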

Conclusion

Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.



Notes

  1. https://axivity.com/downloads/wax9.


Author information


Corresponding author

Correspondence to Aneeq Zia.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.


About this article


Cite this article

Zia, A., Sharma, Y., Bettadapura, V. et al. Video and accelerometer-based motion analysis for automated surgical skills assessment. Int J CARS 13, 443–455 (2018). https://doi.org/10.1007/s11548-018-1704-z

