Virtual Spinal Tap: using haptic data to learn procedures with feel

Distributed medical education programs struggle to provide suitable resources for certain forms of clinical training. Some procedures are hard to learn because watching a skilled practitioner tells the learner little. Using a dual-control haptic device interface and anatomically accurate virtual spinal models, we created a Virtual Spinal Tap to improve the verisimilitude of an online learning experience. The devices and interfaces provided the means to measure and reproduce the subtle forces involved in a procedure that is learned by feel. Integration of this model with other simulation modalities would enhance its applicability.


Introduction
The Northern Ontario School of Medicine, the first new medical school in Canada in 35 years, was established with a focus on distributed medical education (DME) for remote northern and rural communities. DME poses many challenges, both pedagogical and logistical. This is especially true for teaching surgical procedures, where there is a clear need for remote teaching and evaluation of skills and praxis. Many important clinical procedures that are urgently needed in rural communities, such as lumbar puncture, joint aspirations, ocular procedures, or even burr holes, are seen too infrequently for learners to acquire, or practitioners to retain, a good practical feel. (Carr, 2005; Cohen et al., 2006; Luft, Bunker, & Enthoven, 1979; Park, Witzke, & Donnelly, 2002) Distances in the North are great, and it is not always feasible to ask much-needed clinicians to leave their communities for additional training. Some groups have been successful in taking simulation training on the road to small communities, but poor roads and inclement weather pose additional barriers. It is particularly difficult to ask surgeon teachers with particularly scarce skills to waste time touring, and travelling road show crews have seen high rates of burnout. (King, Moseley, Hindenlang, & Kuritz, 2008) To address some of these problems, our research team set out to explore the use of haptic simulators for teaching surgical procedures and skills remotely. In particular, we aimed to design and implement a system that was modular, mobile and hardened for travel.
There is a long history of exploratory and pilot projects on the use of haptic simulators for teaching procedures such as lumbar puncture; some of the articles date back to 1985. There is reasonable evidence to support the use of haptics for procedural training. (van der Meijden & Schijven, 2009; Westebring-van der Putten, Goossens, Jakimowicz, & Dankelman, 2008) Haptic feedback early in training has been shown to make a significant difference in skill acquisition. (Strom et al., 2006) Cao et al. have shown that haptic feedback improves skill acquisition when learners are tested under additional cognitive loading (Cao, Zhou, Jones, & Schwaitzberg, 2007), and many studies have shown high degrees of learner and teacher satisfaction when haptic simulators are employed.

Methodology
Because this is a newly developing field, where much of the work is exploratory, we felt that it was premature to use traditional techniques such as a randomized controlled trial. Rather, we pursued a design-based research approach (Design-Based Research Collective, 2003), examining challenges in the development and implementation of remote haptic simulation, with particular regard to both the technical and pedagogical aspects. We received funding from the Canada Foundation for Innovation (CFI), which supports infrastructure for research in virtual reality. We engaged various groups of undergraduate medical students, recruited through convenience sampling, to provide program evaluation at different stages of the project. We used a SharePoint Team Site for the collaborative aspects of the project, particularly for ongoing field notes across several iterations of the learning designs, project documentation, participant surveys and a wiki on technical issues.

Results
Our first concern was simple distance. Previous research has shown that haptic feedback signals need to be cycled at around 1000 Hz in order to establish realistic sensation. (Marshall, Yap, & Yu, 2008) At this frequency, the speed of light becomes an issue, limiting direct cycles to less than 100 km, which was too short for our needs. This is a well-known problem, and we were pleased to find tools that used local caching and algorithmic interpolation of the signals to address it. Initially, we selected a package from Handshake VR (http://olab.ca/handshakevr-haptic-toolkit/), which provided smooth signal interpolation together with an attractive rapid application design (RAD) interface. See Figure 1. By using a shared set of Handshake VR packages, together with the Phantom Omni haptic device, we found that we could set up interacting dual controls with haptic feedback over a distance. This enabled a powerful learning tool: the teacher can feel what the learner is doing and provide guidance. This is a crucial asset in the learning of some procedures: a teacher can never lay a hand on somebody else's hand during a lumbar puncture and expect to feel what they are doing without completely spoiling their ability to sense underlying tissue structures. The Handshake VR dual control allowed us to selectively sense what the learner was feeling without interfering with their sensation.
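The distance limit quoted above follows from simple arithmetic. Assuming signals travel in optical fibre at roughly two-thirds the vacuum speed of light (an illustrative figure, not one measured in the project), a 1000 Hz round-trip budget allows about 100 km:

```python
# Back-of-envelope check of the distance limit implied by a 1000 Hz haptic
# control loop. The fibre propagation speed below is a typical value
# (~0.67 c), used here purely for illustration.

C_FIBRE_KM_PER_S = 200_000.0   # approximate signal speed in optical fibre
CYCLE_HZ = 1000.0              # update rate needed for realistic haptic feel

def max_direct_distance_km(cycle_hz: float = CYCLE_HZ,
                           speed_km_per_s: float = C_FIBRE_KM_PER_S) -> float:
    """One-way distance a signal can cover within one control cycle,
    halved because the signal must travel out and back each cycle."""
    return speed_km_per_s / (2.0 * cycle_hz)

print(max_direct_distance_km())  # 100.0 km, consistent with the <100 km limit
```

Halving the cycle frequency doubles the reachable distance, which is why interpolation schemes that relax the strict round-trip requirement matter so much at 1200 km.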
The dual-channel interactivity between the paired devices can be selectively controlled. In Teacher Mode, the master device can be used to apply guidance and subtle feedback, helping the learner to find the optimum path through the tissues; in Examiner Mode, the master device senses all that the learner is feeling but provides no haptic feedback towards the learner, whether intentional or not. This dual-control capability was tested with teachers and learners in a variety of situations, sometimes in the same room, but also between our two main teaching sites, separated by 1200 km. There was no appreciable lag in feedback, despite the increased distance. Both teachers and learners rated this capability very highly. "This is a game changer", remarked one highly impressed teacher.
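The mode logic above can be sketched as simple force routing between the two devices. The Handshake VR internals were not published, so the structure and names here are our own assumptions, not the vendor's API:

```python
from dataclasses import dataclass

@dataclass
class ForceFrame:
    tissue: float    # resistance from the virtual tissue model (N)
    guidance: float  # corrective force the teacher applies via the master (N)

def route_forces(frame: ForceFrame, mode: str) -> dict:
    """Force rendered on each device for the given mode (illustrative)."""
    if mode == "teacher":
        # Learner feels tissue plus the teacher's guidance nudges; the
        # teacher feels the tissue resistance the learner is working against.
        return {"learner": frame.tissue + frame.guidance,
                "teacher": frame.tissue}
    if mode == "examiner":
        # Teacher senses everything but feeds nothing back to the learner.
        return {"learner": frame.tissue, "teacher": frame.tissue}
    raise ValueError(f"unknown mode: {mode}")

print(route_forces(ForceFrame(tissue=2.0, guidance=0.5), "examiner"))
# {'learner': 2.0, 'teacher': 2.0}
```

The key design point is the asymmetry: in Examiner Mode the learner-side sum deliberately excludes any master-side contribution, which is what keeps the examiner's hand from spoiling the learner's sensation.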
We created the 3-D volumetric model of the lumbar spine from a DICOM image set, taken from the CT scan of a real patient, and modeled it using a toolset from Akamai (www.akamai.com). This approach produces a much more realistic model than a classic artist's impression: it has real anatomical variability, can more easily incorporate real anatomical pathology, and can also more easily generate additional anatomical features such as osteophytes, osteoporosis or old fractures. Using CT data gave us very fine resolution of the bony detail, which was ideal for our purposes; MRI data can be used as well, providing improved complexity for soft tissue structures. The Phantom Omni haptic device (Figure 2) that we used has 6 Degrees of Freedom. Of these, only 3 provide active resistance. As other similar projects have found, this produces the disconcerting effect that the user can freely wobble the stylus around, no matter how deeply it is inserted into the virtual tissues. To eliminate this effect, one either has to produce motion resistance in the remaining 3 Degrees of Freedom, or physically anchor the entry point of the needle. We explored the use of paired haptic devices to provide this additional motion restriction in the other 3 Degrees of Freedom but found that this was neither practical nor cost-effective.
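The bone-extraction step of such a CT pipeline can be sketched as a simple Hounsfield-unit threshold. This is a generic stand-in, not the commercial toolset's method; the +300 HU threshold, the helper names, and the synthetic slice are all our own assumptions:

```python
import numpy as np

BONE_HU_THRESHOLD = 300  # bone sits well above soft tissue (~20-80 HU)

def to_hounsfield(raw: np.ndarray, slope: float, intercept: float) -> np.ndarray:
    """Convert stored CT pixel values to Hounsfield units (DICOM rescale)."""
    return raw * slope + intercept

def bone_mask(raw: np.ndarray, slope: float = 1.0,
              intercept: float = -1024.0) -> np.ndarray:
    """Boolean mask of voxels likely to be bone; a surface mesh for the
    haptic model could then be extracted, e.g. with marching cubes."""
    return to_hounsfield(raw, slope, intercept) > BONE_HU_THRESHOLD

# Tiny synthetic slice: air (raw 0), soft tissue (raw 1064 -> 40 HU),
# and two bone voxels (raw 2500 -> 1476 HU).
slice_ = np.array([[0, 1064], [2500, 2500]])
print(bone_mask(slice_).sum())  # 2 bone voxels
```

A real pipeline would read the slope and intercept from each DICOM slice's rescale tags rather than hard-coding defaults, and would stack the per-slice masks into a volume before meshing.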
We experimented with a number of rubber membranes to represent the skin of the patient's back and to provide the physical fixation point at the entry site of the virtual needle. We were unable to find a material that provided good stabilization, while allowing user choice of entry point. We note that others such as Haluck et al have had some success with this. (Haluck, Webster, Mohler, & Melkonian, 2000) We experimented with a variety of user visualizations -the display portrayed of the model. Initially we felt that stereoscopic viewing would be an essential feature but we quickly found from user feedback that users could quite easily mentally translate a rotatable 3-D model displayed on a 2-D screen. They did not find that stereoscopic viewing produced much more than a novelty effect. As several users pointed out, one does not have the luxury of looking at needle placement in real life and that it was more important to learn needle guidance by feel alone. We had more success with providing the simple option of making the virtual skin translucent or opaque. Early learners appreciated being able to see roughly what the underlying bony model looked like. Once they were comfortable at being able to find a workable needle path for the lumbar puncture model, they then appreciated the skin being made opaque on-screen and having to reproduce that needle path by feel alone. We also provided learners with the ability to select an orthogonal view of the model -in effect, they would be looking laterally, which they sometimes found helpful in assessing needle depth. However this feature was much less used, and occasionally proved confusing to some learners.
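Learning the needle path "by feel alone" depends on the simulator rendering a depth-dependent resistance profile. The layer boundaries and force values below are invented for illustration; the project's actual tissue parameters were not recorded in this report:

```python
# Illustrative piecewise model of axial needle resistance by insertion depth
# for a lumbar puncture simulator. All depths and forces are made-up values.

LAYERS = [  # (depth at which the layer ends in mm, rendered resistance in N)
    (5.0, 0.5),    # skin
    (20.0, 0.8),   # subcutaneous fat
    (35.0, 1.5),   # supraspinous/interspinous ligament
    (40.0, 3.0),   # ligamentum flavum: firm resistance before the give
    (45.0, 0.2),   # subarachnoid space: near-zero resistance (the "pop")
]

def axial_resistance(depth_mm: float) -> float:
    """Resistance the haptic device renders at a given insertion depth."""
    for end_depth, force in LAYERS:
        if depth_mm <= end_depth:
            return force
    return 0.0  # beyond the modelled depth

print(axial_resistance(38.0))  # 3.0 N inside ligamentum flavum
print(axial_resistance(42.0))  # 0.2 N after the characteristic give
```

The abrupt drop between the last two layers is the haptic cue learners are trained to recognize: a sudden loss of resistance signalling entry into the subarachnoid space.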
A few additional operational issues showed up in testing and implementation. For example, it is crucial to align exactly, for the user, the visual display with the haptic angle of attack of the needle device. Such calibration was typically needed only once, at the start of a learning session. Operating paired devices over a distance required the opening of specific ports on the university firewalls. The haptic devices themselves are now reasonably robust but still require supervision during the setup phase. (See Appendix 1.) We highlight these particular issues because, while they are common to most laboratory setups and easily solved there, they pose far greater challenges in the educational environment that is our main objective: providing remote simulation opportunities to northern and rural communities.
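One standard way to perform such a one-off visual/haptic alignment is a least-squares rigid registration from a few probed correspondence points (the Kabsch algorithm). The calibration method actually used in the project is not documented here, so this sketch, with invented point data, is purely illustrative:

```python
import numpy as np

def kabsch_rotation(P: np.ndarray, Q: np.ndarray) -> np.ndarray:
    """Least-squares rotation R such that R @ p is close to q for each
    matched point pair (rows of P in the haptic frame, Q in the display frame)."""
    Pc = P - P.mean(axis=0)                  # centre both point clouds
    Qc = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Demo: recover a known 90-degree rotation about the vertical axis.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
probed = np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])  # haptic frame
seen = probed @ R_true.T                                           # display frame
print(np.allclose(kabsch_rotation(probed, seen), R_true))  # True
```

In practice the stylus tip would be touched to a handful of known on-screen targets, and the recovered rotation applied to all subsequent device readings for the session.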

Metrics
One advantage of working directly with a haptic device such as the Phantom Omni is that we had access to sophisticated metrics on user performance, including positional and force data provided in real time. At first, we fed this data stream into Hidden Markov Models in an attempt to measure the differences between optimal paths, derived from expert users, and the paths generated by neophyte learners. While this approach looked promising, we quickly found that our experts preferred working directly with their paired learners and trusted their own subjective analysis more. Given that teachers are a scarce resource in such a specialized area, we briefly explored the possibility of digitally recording user-generated paths so that teachers could review learners' progress on a time-shifted basis. For our particular application in remote teaching, this approach looked promising, but we were unable to complete the analysis within the funding period allowed.
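For time-shifted review, a much simpler metric than HMM scoring is the learner's mean deviation from an expert reference path. The function and sample paths below are illustrative assumptions, not the project's code:

```python
import numpy as np

def mean_path_deviation(learner: np.ndarray, expert: np.ndarray) -> float:
    """Mean distance (same units as the paths) from each recorded learner
    sample to its nearest sample on the expert reference path."""
    diffs = learner[:, None, :] - expert[None, :, :]   # pairwise offsets
    dists = np.linalg.norm(diffs, axis=2)              # pairwise distances
    return float(dists.min(axis=1).mean())

# Illustrative data: a straight 40 mm expert track along the needle axis,
# and a learner path with a constant 1 mm lateral drift.
expert = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 40.0, 81)])
learner = expert + np.array([1.0, 0.0, 0.0])
print(mean_path_deviation(learner, expert))  # 1.0
```

A recorded session would pair this score with a force trace over time, so a reviewing teacher could see both where the needle went and how hard the learner pushed.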
As part of a related project connected to the Health Services Virtual Organisation (HSVO, http://olab.ca/hsvo-health-services-virtual-organization/), we also explored the utility of connecting these haptic devices to the HSVO Savoir bus interface. (R. H. Ellaway, Cooperstock, & Spencer, 2010; Spencer, Copeland, Harzallah, Hickey, & Liu, 2009) Although the user-controlled light paths available through Savoir provide high bandwidth and volume of data throughput, the cycle turnaround and delays were sufficient to limit the usefulness of this approach. However, we did have some success with connecting our haptic devices to OpenLabyrinth virtual patients (Rachel Ellaway, Joy, Topps, & Lachapelle, 2013), such that starting parameters, including device position, initial resistances and the relevant anatomical models to load, could all be made available at the time of device launch. This would make it easier for teachers to set up models in remote labs, ready for learners to use when needed, and the virtual patients would provide clinical context so that learners were familiar with the clinical reasoning behind the procedure to be learned. (http://olab.ca/virtual-spinal-tap-scenario)

User satisfaction, collected from both teachers and learners via surveys and focus groups, was uniformly high. With only 16 surveys, none of the results achieved statistical significance, but it was clear that both participant groups were enthusiastic and appreciative of the capabilities of haptic devices in remote teaching of surgical procedures. Many users, particularly teachers, liked the dual-control aspect of paired haptic devices, expressing that this presented an entirely new modality for teaching procedures where students have to learn the feel of the procedure.
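To illustrate the device-launch idea, a virtual patient node might hand a remote haptic station a small parameter payload like the following. Every field name here is hypothetical, since the actual OpenLabyrinth/Savoir message format is not documented in this report:

```python
import json

# Hypothetical launch payload for pre-configuring a remote haptic station.
launch_params = {
    "model": "lumbar_spine_ct_patient07",  # anatomical model to load
    "stylus_start_mm": [0.0, 0.0, 150.0],  # initial device position
    "base_resistance_n": 0.5,              # initial tissue resistance
    "skin_opaque": True,                   # start in feel-only mode
}

payload = json.dumps(launch_params, sort_keys=True)
print(payload)
```

The point of such a payload is that a teacher never needs to be physically present at the remote lab: the virtual patient scenario delivers both the clinical context and the device configuration in one step.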

Economics
One element of this project was to explore the possibilities of implementing these devices in remote teaching situations. Several levels of haptic device were tested, including newer consumer level haptic devices such as the Novint Falcon (https://cs.stanford.edu/people/conti/falcon.html). Although these $300 consumer devices were much more attractive from a cost perspective, their sensitivity and utility were insufficient for our needs. The Phantom Omni device was a more reasonable compromise in this respect.
Software costs were also an issue. Although the Handshake VR rapid application development platform was very effective for prototyping, the cost of a more widespread multiple-site license quickly became prohibitive. Part of this cost was the MATLAB licence (https://www.mathworks.com/products/matlab.html), which is now often available as a site licence for universities. We extensively explored alternative languages and development tools, such as CHAI from Stanford. (Conti et al., 2003) While these tools are significantly cheaper, or even free as open source, the development time required when using them quickly became prohibitive. Newer development tools such as X3D or H3D (http://www.h3dapi.org/modules/mediawiki/index.php/Beginner_X3D_and_H3DAPI) looked promising, but we were unable to assess them within the time constraints of the project.

Conclusions
Overall, our project uncovered some exciting areas of development and illustrated the huge potential of haptic devices, especially in paired mode, for remote procedural teaching. However, it was also clear that a number of areas need to be strengthened before this becomes practically viable. First among these is the availability of haptic application development software with runtime licensing that remains affordable when scaled across multiple sites. Connecting these devices into other projects through network-enabled platforms such as that being developed by HSVO looks very promising but is still in its infancy.

Take Home Messages
Some procedures cannot be learned by watching. Haptics-based simulation is useful for some procedures, such as the spinal tap.
Paired haptic controls allow the teacher to feel what the learner is feeling.
Integrating haptics-based simulation with other simulation modalities in blended scenarios affords a more holistic approach.

Notes On Contributors
This work was carried out over a decade ago but the data files were lost. This report is based on resurrected data.
David Topps designed the project and wrote the manuscript.
Mike Korolenko managed the project and contributed to the manuscript.
Jamie de Domenico provided data and coding assistance in the project.
Donna Newhouse provided subject matter expertise on the virtual anatomy aspects of the project.