Elsevier

Signal Processing

Volume 86, Issue 12, December 2006, Pages 3674-3695
Sensory substitution using tactile pin arrays: Human factors, technology and applications

https://doi.org/10.1016/j.sigpro.2006.02.048

Abstract

Tactile arrays use a matrix of individually controllable elements to present spatial and temporal patterns of cutaneous information. Early devices of this type were developed for sensory substitution, to replace vision or hearing for users with a sensory impairment. Many advances have been made through the appropriation of tactile displays for telerobotics and virtual reality, to represent physical contact with a remote or simulated environment; however, many of these devices have remained engineering prototypes. The recent commercial availability of affordable, portable tactile pin arrays has provided renewed impetus to apply the technology to sensory substitution. Lack of access to digitally stored data can prove a significant barrier to blind people seeking careers in numerate disciplines. Tactile displays could potentially provide a discreet and portable means of accessing graphical information in an intuitive, non-visual manner. Results are presented from experiments on tactual perception related to understanding graphs and simple visualisations with a commercially available tactile array device. It was found that subjects could discriminate positive or negative line gradients to within ±4.7° of the horizontal, compared with ±3.25° for a force feedback mouse and ±2.42° for a raised paper representation.

Introduction

The area of haptic (touch-based) human–computer interaction has grown rapidly over the last few years. A range of new applications has become possible now that touch can be used as an interaction technique. Most research in this area has been concentrated on the use of force feedback devices for the simulation of contact forces when interacting with simulated objects. Devices such as the Phantom haptic interface (see Fig. 1) from Sensable Technologies [1] present the illusion of contact with rigid virtual objects using programmable constraint forces supplied to an end effector such as a thimble, handle or stylus that the user interacts with. A typical device will take the form of a mechanical framework capable of movement along one or more axes (typically two or three for most applications). Movement of the end effector within this space is tracked by the device; this position is then compared to the position of virtual objects within the space. Collisions with these objects are rendered by using actuators (which can be pneumatic, electromechanical, or using a braking system) to constrain the motion of the user to the surfaces of simulated objects.
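The collision-rendering loop described above can be sketched in code. The following is a minimal, hypothetical illustration rather than any particular device's implementation: a one-dimensional servo loop that renders a rigid virtual wall at x = 0 using a penalty-based spring force, where the stiffness value and the commented device API are assumptions.

```python
# Minimal 1-D penalty-based haptic rendering sketch (hypothetical device API).
# A virtual wall at x = 0 resists penetration with a spring force F = k * d.

K_WALL = 1000.0  # wall stiffness in N/m (illustrative value)

def wall_force(position_m: float) -> float:
    """Return the constraint force for a rigid virtual wall at x = 0.

    Outside the wall (x >= 0) no force is applied; inside, a spring
    force proportional to penetration depth pushes the user back out.
    """
    penetration = -position_m
    if penetration <= 0.0:
        return 0.0
    return K_WALL * penetration

# One iteration of the (typically ~1 kHz) servo loop might look like:
#   position = device.read_position()
#   device.write_force(wall_force(position))
```

Real devices run this loop at high update rates (commonly around 1 kHz), since lower rates make rigid surfaces feel soft or unstable.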

Broadly speaking, the human haptic sense can be divided into two distinct channels of sensory experience. Kinaesthetic information refers to the sensation of positions, velocities, forces and constraints that arises from the muscle spindles and tendons. Force feedback haptic interfaces appeal to the kinaesthetic senses by presenting computer-controlled resistive forces to create the illusion of contact with a rigid surface. The cutaneous class of sensations refers to those which arise through direct contact with the skin surface. Cutaneous stimulation can be further subdivided into the sensations of pressure, stretch, vibration, and heat. In some instances pain is also referred to as a separate sensation, though excessive stimulation of any of the other detectable parameters will lead to a feeling of pain.

Exploring a virtual world through a pen-like stylus or a thimble currently gives little provision for cutaneous sensations in force feedback devices, such as the complex distribution of forces on the skin that is perceived when placing a fingertip on a textured surface. Tactile display research seeks to replicate these sensations by using an array of individually controllable mechanical elements to perturb the skin at the user's fingertip (see Fig. 2). Providing distributed cues for surface qualities greatly expands the scope for potential applications of haptic displays [2]. For example, representing the very fine and subtly varying cues for tactile assessment of fabric is problematic with force feedback alone. Tactile displays could allow shoppers and fashion designers to experience clothing materials via the Internet, to aid online purchasing decisions [3], [4]. Tactile displays could also provide increased realism in medical training simulations, allowing the user to differentiate between different types of tissue more easily. This is particularly salient for applications that require direct palpation with the fingertips rather than being mediated via surgical equipment, such as the diagnosis of bovine pregnancy [5].

As early as the 1920s, researchers were interested in using vibration of the skin as a means of information transfer (for example, Gault in 1926, cited in [7]). Tactile-vision substitution systems (TVSS) were the earliest to be developed, in order to present visual information to blind people. In a typical system, a camera receives visual information, which is converted to a tactile representation on a two-dimensional pin array. Some of the earliest work in this area was the development of the Optacon (see Fig. 3), which converted printed letters to a spatially distributed vibrotactile representation on the fingertip using a miniature handheld camera (summarised in [8]). Although reading speeds were significantly slower than with Braille, the Optacon allowed blind people to access any text or graphics without having to wait for it to be converted into Braille. Early pioneering work in TVSS was also performed by Paul Bach-y-Rita and colleagues in the late 1960s. Early systems displayed visual information captured by a tripod-mounted TV camera on a vibrotactile display on the user's back. Due to limited spatial resolution, tactile masking effects and a low dynamic range, the system was not suitable as a day-to-day navigation aid. However, subjects could easily recognise simple shapes and discriminate the orientation of lines. It was also reported that experienced users could perform more complex tasks with the system, such as recognition of faces or electronic assembly (reported in [9]).
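The camera-to-pin-array conversion at the heart of a TVSS can be illustrated with a short sketch. This is an illustration under stated assumptions, not the Optacon's actual algorithm: the array size and threshold are arbitrary, and real devices vibrate the pins rather than simply raising them.

```python
# Sketch of TVSS-style image-to-tactile conversion: a camera frame is
# block-averaged down to the pin-array resolution and thresholded, so a
# pin is raised wherever the corresponding image region is dark (e.g.
# printed ink). Array size and threshold are illustrative assumptions.

def image_to_pins(image, rows=20, cols=20, threshold=128):
    """Downsample a grayscale image (list of lists, values 0-255) to a
    rows x cols binary pin map by block-averaging and thresholding."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols  # pixels per pin (assumes even division)
    pins = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [image[r * bh + i][c * bw + j]
                     for i in range(bh) for j in range(bw)]
            mean = sum(block) / len(block)
            row.append(1 if mean < threshold else 0)  # dark -> pin raised
        pins.append(row)
    return pins
```

A vibrotactile device such as the Optacon would then drive each "raised" element with a fixed-frequency vibration rather than a static displacement.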

Increases in computing power, the notion of “virtual reality”, and the commercial availability of haptic force feedback devices as a research tool have effectively shifted the main emphasis of tactile display research from representing remotely sensed real-world information to the challenges inherent in interacting with an environment consisting of simulated physical models on a computer. The concept of connecting haptic feedback to a virtual world was first voiced in 1965 by Ivan Sutherland, who put forward a vision for an “ultimate display”. Inspired by Sutherland's vision, Frederick Brooks Jr. and his colleagues became the first team to realise force feedback with a graphical environment, in the GROPE project (for an excellent treatment of the development of haptic feedback displays the authors recommend [10]). The scope of force-feedback-enabled applications expanded considerably with the development of the Phantom, the first commercially available desktop force feedback device [1]. The drive to create realistic cutaneous stimulation for virtual environments could now be said to be the motivation behind most tactile display research. In particular, the need to combine tactile displays with force feedback displays has led to increased efforts to miniaturise the technology.

Researchers have sought to appropriate haptic feedback devices for presenting information to visually impaired computer users. For example, Sjostrom outlined a number of guidelines for using haptic technology to provide novel computer interaction techniques for visually impaired people [11]. These include “rules of thumb” for navigation, gaining an overview, understanding objects, haptic widgets and physical interaction. Several projects have focussed on presenting simple visualisations such as graphs, charts and tables using haptic feedback technology. Graphs provide a means by which sighted people can access numerical information in a manner which affords quick visual identification of trends, maxima, minima, intersection points, and other features that would be laborious and time-consuming to extract from a table of numerical data. Fritz and Barner reported the earliest work using a Phantom to present graphs to visually impaired users [12]. Wies et al. reported work using the Logitech Wingman force feedback mouse to present graphical information [13]. The most extensive body of work in this area was performed by Yu, Brewster and colleagues on the Multivis project (www.multivis.org). This project adopted a multimodal approach to presenting visualisations using force feedback and stereo sound. In a series of experiments, multimodal feedback was found to offer significant advantages over a single modality alone [14], [15], [16].
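The graph-rendering idea can be made concrete with a sketch of how a straight line of a given gradient, of the kind used in the gradient-discrimination experiment reported later, might be rasterised onto a binary pin grid. The grid size and centring are illustrative assumptions, not the authors' implementation.

```python
# Rasterise a straight line of a given angle onto a small binary pin grid,
# with the line passing through the centre of the array. Grid dimensions
# are illustrative, not those of any particular device.
import math

def line_to_pins(angle_deg, rows=16, cols=16):
    """Raise the pin nearest to the line y = tan(angle) * x in each
    column; row 0 is treated as the top of the array."""
    pins = [[0] * cols for _ in range(rows)]
    slope = math.tan(math.radians(angle_deg))
    cx, cy = (cols - 1) / 2.0, (rows - 1) / 2.0
    for c in range(cols):
        y = cy - slope * (c - cx)  # negate: screen rows grow downwards
        r = round(y)
        if 0 <= r < rows:
            pins[r][c] = 1
    return pins
```

Sampling one pin per column is the simplest scheme; steep lines would leave gaps between rows, so a practical renderer would interpolate along the line instead.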

Due to their more compact size and lower power requirements, tactile displays offer a potentially much more discreet and affordable means of providing access to data visualisations via the sense of touch. Displays are often small enough to be mounted on another standard human–computer interaction device such as a mouse, keyboard or games controller, or on portable devices such as mobile phones and personal digital assistants (PDAs). This paper reviews developments in tactile displays for sensory substitution, along with relevant literature on perceptual and psychophysical issues related to the sense of touch. An experiment is described investigating perceptual issues related to graph exploration with a commercially available tactile display device.

Section snippets

Methods of tactile presentation for sensory substitution

The earliest structured work in sensory substitution dates back to the 19th century, with the development of embossed raised characters for visually impaired people. Many “low-tech” solutions are still employed for presenting visualisations of numerical data, although they are gradually being augmented by microprocessor-based technology. In recent years several technologies have emerged which can be used to present information to the sense of touch, including vibration (vibrotactile), force…

Human tactile perception

The physiological basis of haptic perception carries profound implications for the design of tactile display devices. Similarly, many relevant psychophysical studies have been conducted, which may indicate the degree of information that can be transmitted by mechanical perturbation of the skin. Here we discuss factors arising from neurophysiological, psychophysical and perceptual studies of the sense of touch.

Sensory substitution

The process of sensory substitution involves sensing stimuli by electronic means, transforming the stimulus via signal processing, and presenting the transformed stimulus in another modality. The main application of such systems is increasing accessibility for those with sensory impairments. The earliest sensory substitution devices converted visual stimuli to tactile representations for blind and visually impaired people. There are also examples of visual-to-auditory [56] and…
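The sense–transform–present pipeline described above can be sketched for the visual-to-auditory case, in the spirit of systems that map image rows to tone frequencies and brightness to loudness. The mapping, frequency range and sample rate below are all illustrative assumptions, not the design of any cited system.

```python
# Sketch of a visual-to-auditory substitution step: one image column
# (scanned left to right over time) is rendered as a mixture of sine
# tones, with higher rows mapped to higher frequencies and pixel
# brightness mapped to tone amplitude. All parameters are illustrative.
import math

def column_to_samples(column, f_low=500.0, f_high=5000.0,
                      duration_s=0.02, rate=8000):
    """Render one image column (list of 0-255 brightness values, top
    row first) as audio samples in [-1, 1]."""
    n = len(column)
    samples = []
    for k in range(int(duration_s * rate)):
        t = k / rate
        s = 0.0
        for row, b in enumerate(column):
            # top of the image (row 0) gets the highest frequency
            f = f_high - (f_high - f_low) * row / max(n - 1, 1)
            s += (b / 255.0) * math.sin(2 * math.pi * f * t)
        samples.append(s / n)  # normalise so the mix stays in range
    return samples
```

Sweeping this function across the columns of a frame yields a short "soundscape" per image, the auditory analogue of scanning a pin array across a page.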

Teleoperation and virtual environments

Researchers became interested in the idea of computer-mediated tactile sensations through the field of teleoperation. Teleoperation is concerned with human control of a remote slave device (typically a robot manipulator arm) using a local master device. The user interacts with the master device, and his/her movements are communicated to the slave and subsequently replicated. This allows the user to perform complex manipulation tasks in hazardous environments (underwater maintenance,…

Tactile display for multimodal human–computer interaction

The mid-1990s also saw tactile displays used within desktop interaction for the first time. Little of this work focussed on tactile pin arrays; rather, single-pin or vibrotactile output was more common. The aim of this work has generally been to supplement the graphical feedback from current computer systems with another form of output. This is often done because the visual sense is overloaded and users can miss information if it is presented graphically; the tactile sense is underutilised and so is…

Experiment

Tactile displays clearly offer great potential for giving visually impaired people access to digitally stored and manipulated data. The most commonly employed solutions at present include screen readers, screen magnifiers, Braille displays, and tactile diagrams produced on heat-raised paper. There are several drawbacks with these methods: they are either unable to respond quickly to dynamic changes in data (hard copies of heat-raised diagrams need to be produced), they…

Technology

Future developments in tactile display technology will likely lead to devices of reduced size and cost and increased pin resolution (smaller spacing between adjacent elements). In particular, virtual environment and teleoperation applications seeking to create realistic simulated textures will require displays with a resolution approaching that of the human sensory system. Reducing the size and cost of devices will allow them to be more easily…

References (100)

  • J.C. Craig, Tactile pattern perception and its perturbations, J. Acoust. Soc. Am. (1985).
  • J.C. Craig et al., Dynamic tactile displays.
  • K.A. Kaczmarek, P. Bach-y-Rita, Tactile displays, in: W. Barfield, T.A. Furness (Eds.), Virtual environments and...
  • G.C. Burdea, Force and Touch Feedback for Virtual Reality (1996).
  • C. Sjostrom, Using haptics in computer interfaces for blind people, CHI 2001, Seattle, WA, 31 March–5 April 2001, pp....
  • J.P. Fritz, K. Barner, Design of a haptic graphing system, 19th RESNA Conference, Salt Lake City, UT,...
  • E. Wies et al., Web-based touch display for accessible science education.
  • W. Yu, S. Brewster, Comparing two haptic interfaces for multimodal graph rendering, IEEE VR2002, 10th Symposium on...
  • W. Yu et al., Multimodal virtual reality versus printed medium in visualization for blind people (2002).
  • W. Yu et al., Evaluation of multimodal graphs for blind people, J. Universal Access Inform. Soc. (2003).
  • E. Foulke et al.
  • W. Gaver, The SonicFinder: an interface that uses auditory icons, Hum.–Comput. Interact. (1989).
  • H. Petrie, V. Johnson, P. McNally, S. Morley, A.M. O’Neill, D. Majoe, Inexpensive tactile interaction for blind...
  • B.P. Challis et al., Design principles for tactile interaction.
  • L.R. Wells et al., Merging of tactile sensory input and audio data by means of the talking tactile tablet (2003).
  • S. Brewster, L.M. Brown, Tactons: structured tactile messages for non-visual information display, 5th Australasian User...
  • L.M. Brown et al., A first investigation into the effectiveness of tactons (2005).
  • J. Linjama et al., Novel, minimalist haptic gesture interaction for mobile devices (2004).
  • R.L. Klatzky et al., Tactile roughness perception with a rigid link interposed between skin and surface, Percept. Psychophys. (1999).
  • J.M. Loomis, Distal attribution and presence, Presence: Teleoperators Virtual Environ. (1992).
  • M. Minsky et al., Feeling and seeing: issues in force display, Comput. Graph. (1990).
  • T.H. Massie, Initial Haptic Exploration with the Phantom: Virtual Touch Through Point Interaction (1996).
  • J.P. Fritz et al., Stochastic models for haptic texture (1996).
  • C. Basdogan, C. Ho, M.A. Srinivasan, A ray-based haptic rendering technique for displaying shape and texture of 3D...
  • S.A. Wall, W.S. Harwin, Modelling of surface identifying characteristics using Fourier series, Proceedings of ASME...
  • S. Choi, H.Z. Tan, An analysis of perceptual instability during haptic texture rendering, 10th International Symposium...
  • K.A. Kaczmarek et al., Electrotactile and vibrotactile displays for sensory substitution systems, IEEE Trans. Biomed. Eng. (1991).
  • L.A. Jones, M. Berris, Material discrimination and thermal perception, 11th Symposium on Haptic Interfaces for Virtual...
  • D.G. Caldwell, C. Gosney, Enhanced tactile feedback (tele-taction) using a multi-functional sensory system, IEEE...
  • I.R. Summers et al., A broadband tactile array on the fingertip, J. Acoust. Soc. Am. (2002).
  • J. Pasquero, V. Hayward, STReSS: a practical tactile display with one millimeter spatial resolution and 700 Hz refresh...
  • D.G. Caldwell, N. Tsagarakis, C. Giesler, An integrated tactile/shear feedback array for stimulation of finger...
  • J.M. Loomis et al., Tactual perception.
  • K.O. Johnson et al., Neural mechanisms of tactual form and texture perception, Annu. Rev. Neurosci. (1992).
  • D.A. Kontarinis, R.D. Howe, Display of high frequency tactile information to teleoperators, 1993, pp....
  • K.O. Johnson, Neural basis of haptic perception.
  • R.T. Verrillo, Effect of contactor area on vibrotactile threshold, J. Acoust. Soc. Am. (1963).
  • C.H. Rogers, Choice of stimulator frequency for tactile arrays, IEEE Trans. Man–Machine Systems MMS- (1970).
  • S.J. Bolanowski et al., Four channels mediate the mechanical aspects of touch, J. Acoust. Soc. Am. (1988).
  • C.E. Sherrick et al., The localization of low- and high-frequency vibrotactile stimuli, J. Acoust. Soc. Am. (1990).