Sensory substitution using tactile pin arrays: Human factors, technology and applications
Introduction
The area of haptic (touch-based) human–computer interaction has grown rapidly over the last few years. A range of new applications has become possible now that touch can be used as an interaction technique. Most research in this area has concentrated on the use of force feedback devices to simulate contact forces when interacting with simulated objects. Devices such as the Phantom haptic interface (see Fig. 1) from Sensable Technologies [1] present the illusion of contact with rigid virtual objects using programmable constraint forces applied to an end effector, such as a thimble, handle or stylus, that the user interacts with. A typical device takes the form of a mechanical framework capable of movement along one or more axes (typically two or three for most applications). Movement of the end effector within this space is tracked by the device; this position is then compared with the positions of virtual objects in the space. Collisions with these objects are rendered by using actuators (which may be pneumatic, electromechanical, or brake-based) to constrain the user's motion to the surfaces of the simulated objects.
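The track-compare-constrain loop described above can be sketched in code. The following is a minimal, hypothetical illustration of penalty-based rendering against a flat virtual surface; the stiffness constant and function names are illustrative assumptions, not taken from the Phantom or any other particular device:

```python
import numpy as np

STIFFNESS = 800.0  # N/m, illustrative surface stiffness

def render_contact_force(effector_pos, surface_height=0.0):
    """Return the constraint force for a flat horizontal virtual surface.

    effector_pos -- (x, y, z) position of the tracked end effector, in metres.
    The virtual surface occupies z <= surface_height; any penetration below
    it is resisted by a spring-like (Hooke's law) penalty force.
    """
    penetration = surface_height - effector_pos[2]
    if penetration <= 0.0:
        # No collision: free motion, zero force
        return np.zeros(3)
    # Penalty force along the surface normal (here, +z), proportional
    # to penetration depth
    return np.array([0.0, 0.0, STIFFNESS * penetration])

# One iteration of the haptic loop (real devices run this at ~1 kHz):
force = render_contact_force(np.array([0.01, 0.02, -0.005]))
```

In practice the comparison is made against arbitrary object geometry rather than a single plane, and damping terms are added for stability, but the structure — track position, detect penetration, emit a constraint force — is the same.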
Broadly speaking, the human haptic sense can be divided into two distinct channels of sensory experience. Kinaesthetic information refers to the sensation of positions, velocities, forces and constraints that arises from the muscle spindles and tendons. Force feedback haptic interfaces appeal to the kinaesthetic senses by presenting computer-controlled resistive forces to create the illusion of contact with a rigid surface. The cutaneous class of sensations refers to those which arise through direct contact with the skin surface. Cutaneous stimulation can be further subdivided into the sensations of pressure, stretch, vibration, and heat. In some instances pain is also treated as a separate sensation, although excessive stimulation of any of the other detectable parameters will also lead to a feeling of pain.
Exploring a virtual world through a pen-like stylus or a thimble currently gives little provision for cutaneous sensations in force feedback devices, such as the complex distribution of forces on the skin that is perceived when placing a fingertip on a textured surface. Tactile display research seeks to replicate these sensations by using an array of individually controllable mechanical elements to perturb the skin at the user's fingertip (see Fig. 2). Providing distributed cues for surface qualities greatly expands the scope of potential applications of haptic displays [2]. For example, representing the very fine and subtly varying cues for tactile assessment of fabric is problematic with force feedback alone. Tactile displays could allow shoppers and fashion designers to experience clothing materials via the Internet to aid with online purchasing decisions [3], [4]. Tactile displays could also provide increased realism in medical training simulations, allowing the user to differentiate between different types of tissue more easily. This is particularly salient for applications that require direct palpation with the fingertips rather than interaction mediated via surgical equipment, such as the diagnosis of bovine pregnancy [5].
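As an illustration of how such a pin array might render a simulated texture, the sketch below samples a height field under the fingertip into a grid of pin displacements. The array dimensions, pin pitch, travel range and the sinusoidal "texture" are all hypothetical values chosen purely for the example:

```python
import numpy as np

ROWS, COLS = 4, 8        # pins in the array (a small fingertip display)
PITCH = 0.002            # 2 mm spacing between adjacent pins, metres
MAX_TRAVEL = 0.001       # 1 mm maximum pin displacement

def texture_height(x, y):
    # A sinusoidal grating (period 4 mm) standing in for a simulated
    # texture; for simplicity it varies only along x
    return 0.0005 * (1.0 + np.sin(2 * np.pi * x / 0.004))

def pin_heights(finger_x, finger_y):
    """Sample the height field at each pin location under the fingertip,
    clipping to the actuators' travel range."""
    xs = finger_x + np.arange(COLS) * PITCH
    ys = finger_y + np.arange(ROWS) * PITCH
    gx, gy = np.meshgrid(xs, ys)
    return np.clip(texture_height(gx, gy), 0.0, MAX_TRAVEL)

# As the tracked fingertip position changes, the pattern scrolls across
# the array, giving the impression of stroking the texture:
heights = pin_heights(0.0, 0.0)   # shape (ROWS, COLS), metres
```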
As early as the 1920s, researchers were interested in using vibration of the skin as a means of information transfer (for example, Gault in 1926, cited in [7]). Tactile-vision substitution systems (TVSS) were the earliest to be developed, in order to present visual information to blind people. In a typical system, a camera receives visual information, which is converted to a tactile representation on a two-dimensional pin array. One of the earliest devices in this area was the Optacon (see Fig. 3), which converted printed letters to a spatially distributed vibrotactile representation on the fingertip using a miniature handheld camera (summarised in [8]). Although reading speeds were significantly slower than with Braille, the Optacon allowed blind people to access any text or graphics without having to wait for it to be converted into Braille. Early pioneering work in TVSS was also performed by Paul Bach-y-Rita and colleagues in the late 1960s. Early systems displayed visual information captured by a tripod-mounted TV camera on a vibrotactile display on the user's back. Due to limited spatial resolution, tactile masking effects and a low dynamic range, the system was not suitable as a day-to-day navigation aid. However, subjects could easily recognise simple shapes and discriminate the orientation of lines. It was also reported that experienced users could perform more complex tasks, such as recognition of faces or electronic assembly, using the system (reported in [9]).
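The camera-to-pin-array conversion at the heart of a TVSS can be illustrated with a simple sketch: downsample the camera image to the resolution of the pin array and activate the pins over dark regions. The array size and threshold here are illustrative assumptions, not the parameters of the Optacon or of Bach-y-Rita's systems:

```python
import numpy as np

def image_to_pin_pattern(image, rows=20, cols=20, threshold=0.5):
    """Reduce a grayscale image (2-D array, values in [0, 1]) to a binary
    activation pattern for a rows x cols pin array, by block-averaging the
    image and thresholding each block's mean intensity."""
    h, w = image.shape
    bh, bw = h // rows, w // cols
    # Trim so the image tiles evenly, then average each block
    blocks = image[:bh * rows, :bw * cols].reshape(rows, bh, cols, bw)
    intensity = blocks.mean(axis=(1, 3))
    return intensity < threshold  # dark regions (e.g. print) activate pins

# A white image with a dark vertical bar becomes a column of active pins:
img = np.ones((100, 100))
img[:, 40:60] = 0.0
pattern = image_to_pin_pattern(img)  # pattern[:, 8:12] is all True
```

Real systems refine this with contrast enhancement and vibrotactile (rather than on/off) drive, but the sensing-transformation-presentation pipeline is the same.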
Increases in computing power, the notion of “virtual reality”, and the commercial availability of haptic force feedback devices as research tools have effectively shifted the main emphasis of tactile display research from representing remotely sensed real-world information to the challenges inherent in interacting with an environment consisting of simulated physical models on a computer. The concept of connecting haptic feedback to a virtual world was first voiced in 1965 by Ivan Sutherland, who put forward a vision for an “ultimate display”. Inspired by Sutherland's vision, Frederick Brooks Jr. and his colleagues became the first team to combine force feedback with a graphical environment in the GROPE project (for an excellent treatise on the development of haptic feedback displays the authors recommend [10]). The scope of force-feedback-enabled applications expanded considerably with the development of the Phantom, the first commercially available desktop force feedback device [1]. The drive to create realistic cutaneous stimulation for virtual environments could now be said to be the motivation behind most tactile display research. In particular, the need to combine tactile displays with force feedback devices has led to increased efforts to miniaturise the technology.
Researchers have sought to appropriate haptic feedback devices for presenting information to visually impaired computer users. For example, Sjostrom outlined a number of guidelines for using haptic technology to provide novel computer interaction techniques for visually impaired people [11]. These include “rules of thumb” for navigation, gaining an overview, understanding objects, haptic widgets and physical interaction. Several projects have focussed on presenting simple visualisations such as graphs, charts and tables using haptic feedback technology. Graphs provide a means by which sighted people can access numerical information in a manner that affords quick, visual identification of trends, maxima, minima, intersection points, and other features that would be laborious and time-consuming to extract from a table of numerical data. Fritz and Barner reported the earliest work using a Phantom to present graphs to visually impaired users [12]. Wies et al. reported work using the Logitech Wingman force feedback mouse to present graphical information [13]. The most extensive body of work in this area was performed by Yu, Brewster and colleagues on the Multivis project (www.multivis.org). This project adopted a multimodal approach to presenting visualisations using force feedback and stereo sound. In a series of experiments, multimodal feedback was found to offer significant advantages over a single modality alone [14], [15], [16].
Due to their more compact size and lower power requirements, tactile displays offer a potentially much more discreet, affordable means of providing access to data visualisations via the sense of touch. Displays are often small enough to be mounted on another standard human–computer interaction device, such as a mouse, keyboard or games controller, or on portable devices such as mobile phones and personal digital assistants (PDAs). This paper reviews developments in tactile displays for sensory substitution, along with relevant literature on perceptual and psychophysical issues related to the sense of touch. An experiment is then described investigating perceptual issues related to graph exploration with a commercially available tactile display device.
Methods of tactile presentation for sensory substitution
The earliest structured work in sensory substitution dates back to the 19th century, with the development of embossed raised characters for visually impaired people. Many “low-tech” solutions are still employed for presenting visualisations of numerical data, although they are being gradually augmented by microprocessor-based technology. In recent years several technologies have emerged which can be used to present information to the sense of touch, including vibration (vibrotactile), force
Human tactile perception
The physiological basis of haptic perception carries profound implications for the design of tactile display devices. Similarly, many relevant psychophysical studies have been conducted, which may indicate the degree of information that can be transmitted by mechanical perturbation of the skin. Here we discuss factors arising from neurophysiological, psychophysical and perceptual studies of the sense of touch.
Sensory substitution
The process of sensory substitution involves the sensing of stimuli by electronic means, transformation of the stimulus via signal processing, and presentation of the transformed stimulus in another modality. The main application of these systems is increasing accessibility for those with sensory impairments. The earliest sensory substitution devices converted visual stimuli to tactile representations for the blind and visually impaired. There are also examples of visual-to-auditory [56] and
Teleoperation and virtual environments
Researchers became interested in the idea of computer-mediated tactile sensations through the field of teleoperation. Teleoperation is concerned with human control of a remote slave device (typically a robot manipulator arm) using a local master device. The user interacts with the master device, and his/her movements are communicated to the slave and subsequently replicated. This allows the user to perform complex manipulation tasks in hazardous environments (underwater maintenance,
Tactile display for multimodal human–computer interaction
The mid-1990s also saw tactile displays used within desktop interaction for the first time. Little of this work focussed on tactile pin arrays; rather, single pin or vibrotactile output was more common. The aim of this work generally has been to supplement the graphical feedback from current computer systems with another form of output. This is often done because the visual sense is overloaded and users can miss information if presented graphically; the tactile sense is underutilised and so is
Experiment
Tactile displays clearly offer great potential for enabling visually impaired people to access digitally stored and manipulated data. The most commonly employed solutions at present include screen readers, screen magnifiers, Braille displays, and tactile diagrams produced with heat-raised paper. These methods have several drawbacks: they are either unable to respond quickly to dynamic changes in data (hard copies of heat-raised diagrams must be produced), they
Technology
Future developments in tactile display technology will likely lead to devices of reduced size and cost and increased resolution of pins (smaller spacing between adjacent elements). In particular, for virtual environment and teleoperation applications seeking to create realistic, simulated textures, displays of a resolution approaching that of the human sensory system will be necessary to create realistic feeling textures. Reducing the size and cost of devices will allow them to be more easily
References (100)
- et al.
Haptics in virtual environments: taxonomy, research status, and challenges
Comput. Graph.
(1997) - et al.
Responses of mechanoreceptive afferent units in the glabrous skin of the human hand to sinusoidal skin displacements
Brain Res.
(1982) - et al.
Regional differences in sensitivity to vibration in the glabrous skin of the human hand
Brain Res.
(1984) - et al.
Towards a standard for pointing device evaluation: perspectives on 27 years of Fitts' law research in HCI
Int. J. Human-Comput. Stud.
(2004) - et al.
Movement characteristics using a mouse with tactile and force feedback
Int. J. Human-Comput. Stud.
(1996) - et al.
The Phantom haptic interface: a device for probing virtual objects
(1994) - M. Govindaraj, A. Garg, A. Raheja, G. Huang, D. Metaxas, Haptic simulation of fabric hand, Eurohaptics, Dublin,...
- et al.
Sensing the fabric: to simulate sensation through sensory evaluation and in response to standard acceptable properties of specific materials when viewed as a digital image
- et al.
Preliminary development and evaluation of a bovine rectal palpation simulator for training veterinary students
Cattle Practice: J. Br. Cattle Vet. Assoc.
(2003) - I.R. Summers, C.M. Chanter, A.L. Southall, A.C. Brady, Results from a tactile array on the fingertip, Eurohaptics,...
Tactile pattern perception and its perturbations
J. Acoust. Soc. Am.
Dynamic tactile displays
Force and Touch Feedback for Virtual Reality
Web-based touch display for accessible science education
Multimodal virtual reality versus printed medium in visualization for blind people
Evaluation of multimodal graphs for blind people
J. Universal Access Inform. Soc.
The SonicFinder: an interface that uses auditory icons
Hum–Comput. Interact.
Design principles for tactile interaction
Merging of tactile sensory input and audio data by means of the talking tactile tablet
A first investigation into the effectiveness of tactons
Novel, minimalist haptic gesture interaction for mobile devices
Tactile roughness perception with a rigid link interposed between skin and surface
Percept. Psychophys.
Distal attribution and presence
Presence: Teleoperator. Virtual Environ.
Feeling and seeing: issues in force display
Comput. Graph.
Initial haptic exploration with the Phantom: virtual touch through point interaction
Stochastic models for haptic texture
Electrotactile and vibrotactile displays for sensory substitution systems
IEEE Trans. Biomed. Eng.
A broadband tactile array on the fingertip
J. Acoust. Soc. Am.
Tactual perception
Neural mechanisms of tactual form and texture perception
Annu. Rev. Neurosci.
Neural basis of haptic perception
Effect of contactor area on vibrotactile threshold
J. Acoust. Soc. Am.
Choice of stimulator frequency for tactile arrays
IEEE Trans. Man-Machine Syst.
Four channels mediate the mechanical aspects of touch
J. Acoust. Soc. Am.
The localization of low- and high-frequency vibrotactile stimuli
J. Acoust. Soc. Am.