Lateralized irrelevant speech alters visuospatial selective attention mechanisms

https://doi.org/10.1016/j.biopsycho.2005.07.007

Abstract

Recent studies indicate that the coordination of spatial attention across modalities may in part be mediated by a supramodal attentional system. We sought to extend the concept of a supramodal system and hypothesized that involuntary modulations of auditory attentional processes by irrelevant speech signals influence visuospatial attention, suggesting crossmodal links between vision and speech. To test this, we recorded event-related brain potentials (ERPs) from 12 healthy subjects in a visuospatial selective attention task. The task of identifying target stimuli appearing at lateral visual field locations elicited the expected enhancements of the early P1 and N1 ERP components to attended visual stimuli. Intelligible and unintelligible task-irrelevant speech was presented either at the visually attended location or at the opposite visual field location. Speech contralateral to unattended visual stimuli led to a decreased N1 amplitude; this effect was stronger for intelligible speech. Thus, speech influences the allocation of visuospatial attention if it is presented at the unattended location. The results suggest crossmodal links between speech and visuospatial attention mechanisms at a very early stage of human perception.

Introduction

In everyday life we are not only exposed to stimuli of a single modality, but rather, most of the time, events are multimodal, i.e. they provide adequate input to more than one sensory system. In many situations, the adaptive control of behavior requires the integration and coordination of information about relevant objects and events that originates from different input modalities, but from overlapping spatial locations. The identification of the underlying neural mechanisms remains a key challenge for neurobiological research. A number of studies have shown that there are crossmodal links between the auditory and the visual modality (e.g., Spence and Driver, 1996, Eimer and van Velzen, 2002).

Within one modality, stimuli that are relevant to a given task are processed with greater speed and accuracy than task-irrelevant stimuli (e.g., Posner, 1980). ERP studies have shown that endogenous (voluntary) direction of attention to a location in space influences neural processes starting less than 100 ms after stimulus presentation (e.g., Eason et al., 1969, Heinze et al., 1990). This effect has been interpreted as reflecting altered perceptual processing (Downing, 1988, Eriksen and James, 1986) and has been termed the sensory gain control mechanism (Eason, 1981, Harter and Aine, 1984, Hillyard and Mangun, 1987). Sensory gain control operates within modality-specific brain areas, but interactions between attentional mechanisms in different modalities have also been demonstrated (e.g., Eimer, 2001, Eimer and Driver, 2000, Hötting et al., 2003).

Several behavioral studies have found evidence for crossmodal interactions in endogenous spatial attention between vision and audition. In most of these experiments, participants had to direct their attention to the expected location of the target stimuli. On a minority of trials stimuli of a different modality were presented. Superior performance for stimuli at the expected location was observed for both modalities, suggesting that the focus of attention within one modality may influence the processing of information in other modalities (e.g., Spence and Driver, 1996).

However, these behavioral studies show crossmodal interactions in voluntary spatial attention between vision and audition, but they do not allow strong conclusions about which stages in the processing of visual and auditory information are affected by such links. Hence, they do not give any direct insight into the neural processes associated with such interactions (Eimer and van Velzen, 2002). To examine the neural correlates of crossmodal attentional interactions, a number of ERP studies have been conducted (Alho et al., 1992, Woods et al., 1992, Spence and Driver, 1996, Spence and Driver, 1997, Eimer and Schröger, 1998, Luo and Wei, 1999). Their results suggest that the coordination of spatial attention across modalities may in part be mediated by a common supramodal attentional control system. For example, Eimer (1999a) investigated ERP effects of crossmodal attention between vision and audition in a task that required the detection of auditory and visual stimuli at the same or at opposite visual field locations. Eimer found attentional modulations of the early (visual) P1 component in the “attend same” but not in the “attend opposite” condition, whereas the later (auditory) Nd component showed an attentional modulation even in the “attend opposite” condition.

Supramodal attentional processes appear to encompass not only the visual and auditory modalities. Behavioral studies have found evidence for crossmodal interactions in endogenous spatial attention between vision and touch (Spence et al., 2000). More recently, ERP studies have confirmed these results and provided evidence for crossmodal links in voluntary spatial attention between vision and touch (Eimer and Driver, 2000, Eimer and van Velzen, 2003), between touch and audition (Hötting et al., 2003), and between vision, touch, and audition (Eimer et al., 2002). Overall, these ERP results suggest that relatively early stages of visual, auditory, and tactile information processing can be affected by crossmodal interactions.

The aim of the present study was to extend the concept of a supramodal attention system by investigating crossmodal links between visuospatial attention and speech. We hypothesized that modulations of attentional processes in one modality may influence attentional processes in a different modality via this control system. Auditory distractors are known to interfere with visuospatial selection processes (Spence and Driver, 1997). Thus, task-irrelevant auditory stimuli of high ecological validity, such as speech, may influence visuospatial selection processes when presented at the (visually) attended location as compared to the opposite location.

To this end, we recorded event-related brain potentials in a paradigm that combined a visuospatial selective attention task with the irrelevant speech effect. The irrelevant speech effect refers to the impaired recall of visually presented items when task-irrelevant speech is presented in the background at the time of encoding (Salamé and Baddeley, 1982). It has been shown that meaningful speech disrupts recall more than meaningless speech does (LeCompte et al., 1997, Neely and LeCompte, 1999). Thus, irrelevant speech appears to capture attentional resources automatically and should be able to influence visuospatial selection processes. The experimental effect should be stronger for intelligible than for unintelligible speech. This was tested by presenting meaningful (a tape played forwards, “intelligible”) and meaningless (a tape played backwards, “unintelligible”) speech in a standard sustained visuospatial attention task.
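For orientation, the following minimal Python sketch enumerates the condition structure that this design implies. The factor names and the left/right coding of the attended hemifield are our own shorthand assumptions for illustration, not the authors' condition codes.

```python
# Sketch of the implied factorial design (labels are illustrative assumptions):
# visual attention is directed to one lateral hemifield, and task-irrelevant
# speech (played forwards = intelligible, backwards = unintelligible) is
# presented either at the attended or at the opposite location.
from itertools import product

ATTENDED_SIDE = ("left", "right")                   # visually attended hemifield
SPEECH_TYPE = ("intelligible", "unintelligible")    # forwards vs. backwards speech
SPEECH_SIDE = ("attended", "opposite")              # speech at attended vs. opposite location

conditions = [
    {"attended_side": a, "speech_type": t, "speech_side": s}
    for a, t, s in product(ATTENDED_SIDE, SPEECH_TYPE, SPEECH_SIDE)
]

for c in conditions:
    print(c)
```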

Section snippets

Subjects

Twelve subjects (four women) with a mean age of 26.4 years (range 21–32 years) participated in the experiment. All were right-handed as measured by the Edinburgh Handedness Inventory (Oldfield, 1971) and had normal or corrected-to-normal vision. Two participants were unable to maintain eye fixation and were replaced prior to data analysis.

Stimulus material

The visual stimuli consisted of vertical bars subtending 6.5° of visual angle vertically and 4.2° horizontally. They were presented in one of two

Results

The ANOVA of the ERP data revealed a significant main effect of the factor attention (P1: F[1,11] = 18.0, p < 0.005, mean square error = 1.0; N1: F[1,11] = 5.99, p < 0.05, mean square error = 3.67). Fig. 1 and Fig. 2 display the effects of visuospatial attention on the grand-average waveforms to the non-target stimuli. At occipital scalp sites, spatially attended stimuli elicited enhanced P1 and N1 amplitudes as compared to the unattended condition. The latencies of the P1 and N1 remained
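For illustration only, a minimal sketch of how such a one-way repeated-measures ANOVA on mean component amplitudes could be computed is given below. It uses random placeholder data for 12 subjects; the column names and the use of statsmodels' AnovaRM are assumptions for the sketch, not the authors' analysis pipeline.

```python
# Sketch: repeated-measures ANOVA on ERP mean amplitudes (attended vs. unattended).
# All values are random placeholders, not the study's data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(1, 13), 2)            # 12 subjects, 2 conditions each
attention = np.tile(["attended", "unattended"], 12)  # within-subject factor
# placeholder P1 mean amplitudes (µV), larger on average for attended stimuli
amplitude = rng.normal(loc=np.where(attention == "attended", 1.5, 0.8), scale=0.5)

df = pd.DataFrame({"subject": subjects,
                   "attention": attention,
                   "amplitude": amplitude})

# One-way repeated-measures ANOVA yielding F(1, 11) for the attention factor
res = AnovaRM(df, depvar="amplitude", subject="subject", within=["attention"]).fit()
print(res)
```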

Discussion

The fundamental experimental finding was the crossmodal modulation of visuospatial selective attention mechanisms by task-irrelevant speech.

Spatial selective attention led to increased amplitudes of the temporo-occipital P1 and N1 components to attended as compared to unattended visual stimuli. This effect is in line with previous studies and supports the sensory gain control hypothesis that spatial attention modulates neural processes at a perceptual level (e.g., Eason, 1981, Hillyard and

References (47)

  • S. Johannes et al., Luminance and spatial attention effects on early visual processing, Cognitive Brain Research (1995)
  • J.J. Lange et al., ERP effects of spatial attention and display search with unilateral and bilateral stimulus displays, Biological Psychology (1999)
  • S.J. Luck et al., Visual event-related potentials index focused attention within bilateral stimulus arrays. II. Functional dissociation of P1 and N1 components, Electroencephalography and Clinical Neurophysiology (1990)
  • Y.J. Luo et al., Cross-modal selective attention to visual and auditory stimuli modulates endogenous ERP components, Brain Research (1999)
  • G.R. Mangun et al., Sustained visual-spatial attention produces costs and benefits in response time and evoked neural activity, Neuropsychologia (1998)
  • R.C. Oldfield, The assessment and analysis of handedness: the Edinburgh Inventory, Neuropsychologia (1971)
  • F. Pavani et al., Auditory deficits in visuospatial neglect patients, Cortex (2004)
  • P. Salamé et al., Disruption of short-term memory by unattended speech: implications for the structure of working memory, Journal of Verbal Learning and Verbal Behavior (1982)
  • W.A. Stephenson et al., A balanced non-cephalic reference electrode, Electroencephalography and Clinical Neurophysiology (1951)
  • D.L. Woods et al., Intermodal selective attention. I. Effects on event-related potentials to lateralized auditory and visual stimuli, Electroencephalography and Clinical Neurophysiology (1992)
  • R. Desimone et al., Neural mechanisms of visual processing in monkeys
  • C.J. Downing, Expectancy and visual-spatial attention: effects on perceptual quality, Journal of Experimental Psychology: Human Perception and Performance (1988)
  • R.G. Eason, Visual evoked potential correlates of early neural filtering during selective attention, Bulletin of the Psychonomic Society (1981)