NeuroImage

Volume 97, 15 August 2014, Pages 224-235

Visual, auditory and tactile stimuli compete for early sensory processing capacities within but not between senses

https://doi.org/10.1016/j.neuroimage.2014.04.024

Highlights

  • We studied stimulus interactions within and across senses in the EEG.

  • Stimulus streams elicited visual, auditory and tactile steady-state responses.

  • Competing stimuli were briefly presented in the same or in a different sense.

  • Across all stimulus combinations, only competitors in the same sense led to reduced processing.

  • Competition exerted modality-specific influences on early sensory processing.

Abstract

We investigated whether unattended visual, auditory and tactile stimuli compete for capacity-limited early sensory processing across senses. In three experiments, we probed competitive audio-visual, visuo-tactile and audio-tactile stimulus interactions. To this end, continuous visual, auditory and tactile stimulus streams (‘reference’ stimuli) were frequency-tagged to elicit steady-state responses (SSRs). These electrophysiological oscillatory brain responses indexed ongoing stimulus processing in corresponding senses. To induce competition, we introduced transient frequency-tagged stimuli in same and/or different senses (‘competitors’) during reference presentation. Participants performed a separate visual discrimination task at central fixation to control for attentional biases of sensory processing. A comparison of reference-driven SSR amplitudes between competitor-present and competitor-absent periods revealed reduced amplitudes when a competitor was presented in the same sensory modality as the reference. Reduced amplitudes indicated the competitor's suppressive influence on reference stimulus processing. Crucially, no such suppression was found when a competitor was presented in a different modality than the reference. These results strongly suggest that early sensory competition is exclusively modality-specific and does not extend across senses. We discuss consequences of these findings for modeling the neural mechanisms underlying intermodal attention.

Introduction

Imagine standing in a particularly crowded place while listening to someone on your phone. People are walking all over the place, talking loudly, perhaps even bumping into you. You will have to try hard to focus on the caller's voice. This situation illustrates how in everyday life limited neural processing capacities force the human brain to actively select a specific source of information among a manifold of unrelated sensory events (Broadbent, 1952; Kastner and Ungerleider, 2000; Neisser and Becklen, 1975). In our example a distinction can be made between selecting the voice over background noise – a selection within the auditory modality – and selecting auditory information over visual and tactile information – a selection between sensory modalities.

The neural mechanisms underlying attentional selection within a sensory modality have been formalized in the biased competition model (Kastner et al., 1998; Moran and Desimone, 1985). Biased competition rests on two central assumptions: (I) Two or more concurrent stimuli enter a competition for limited processing capacity that leads to mutual suppression. Although primarily established in visual attention research, modality-specific inter-stimulus competition can also be inferred from findings of suppressive effects between auditory (Kawase et al., 2012; Ross et al., 2012) and tactile stimuli (Severens et al., 2010). (II) Selective attention to one stimulus releases it from mutual suppression and biases the competition in favor of the selected stimulus' processing. Following these assumptions, inter-stimulus competition constitutes a necessary prerequisite for the attentional bias in modality-specific sensory processing.

Attentional selection between sensory modalities, i.e. ‘intermodal’ attention (Alho et al., 1992; Boulter, 1977; Eimer and Schröger, 1998; Porcu et al., 2013), is less well understood. It remains an open question whether the attentional mechanisms that constitute the biased competition framework also account for intermodal attention. Recent neuroimaging studies have investigated crossmodal interactions in early sensory processing while participants attended to one sensory modality (Johnson and Zatorre, 2005; Langner et al., 2011; Laurienti et al., 2002; Shomstein and Yantis, 2004). A consistent finding of these studies was that neural activity that corresponded to the processing of input from unattended modalities decreased. This reduction might have been the consequence of a reallocation of processing capacities from unattended to attended sensory modalities. Importantly, such a push–pull mechanism necessarily implies (I) that sensory modalities share common processing capacities and (II) that sensory modalities compete for these common capacities.

However, in the studies described above, participants always attended to stimulation in at least one sensory modality. These situations must have imposed a strong bias on any crossmodal competition if we assume that a biased-competition-like mechanism governs intermodal attention. Participants' attention to visual stimulation, for example, imposed a strong bias towards visual processing while suppressing auditory and/or tactile processing. As a consequence, it has not been addressed to date whether (and how) visual, auditory and tactile processing interact when stimulation in neither modality is attended. An unbiased crossmodal competition that influences sensory stimulus processing per se has yet to be observed. As laid out above, such crossmodal competitive stimulus interactions would be the vital foundation of a biased-competition-like account of intermodal attention.

The present study aimed to test for crossmodal competition in the absence of an attentional bias. To this end, we conducted three experiments, all employing similar paradigms, yet each featuring a unique combination of stimuli from two sensory modalities: visual and auditory stimuli in Experiment 1, visual and tactile stimuli in Experiment 2 and tactile and auditory stimuli in Experiment 3. In each experiment we frequency-tagged sensory stimulus streams (‘reference stimuli’) that were presented for several seconds. Frequency-tagged stimuli elicited oscillatory brain responses, phase-locked to the stimulation, that indexed the ongoing sensory processing in corresponding sensory modalities. These so-called steady-state responses (SSRs; Regan, 1989) have been shown to decrease in amplitude when a competing stimulus was presented in the same sensory modality in vision (Fuchs et al., 2008; Keitel et al., 2010), audition (Kawase et al., 2012; Ross et al., 2012) and touch (Severens et al., 2010). During reference stimulus presentation, we therefore introduced frequency-tagged ‘competitors’, i.e. stimuli of the same and/or different sensory modality, to induce competition. We compared amplitude changes of reference-driven SSRs between competitor-absent and competitor-present periods.
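The SSR amplitude measure at the heart of this approach can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis pipeline: it averages epochs to isolate the phase-locked response and reads the amplitude from the FFT bin at the tagging frequency. The function name, sampling rate and frequencies are assumptions chosen for the example.

```python
import numpy as np

def ssr_amplitude(epochs, fs, tag_freq):
    """Estimate SSR amplitude at a tagging frequency.

    Averaging epochs first retains only the phase-locked
    (steady-state) part of the signal, while non-phase-locked
    noise averages out. epochs: (n_epochs, n_samples); fs in Hz.
    """
    evoked = epochs.mean(axis=0)                 # phase-locked average
    n = evoked.size
    amps = np.abs(np.fft.rfft(evoked)) * 2 / n   # single-sided amplitude spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return amps[np.argmin(np.abs(freqs - tag_freq))]

# Toy data: 40 two-second epochs of a 14 Hz "tagged" response buried in noise
fs, f_tag = 500, 14.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(1)
epochs = np.sin(2 * np.pi * f_tag * t) + rng.normal(0.0, 1.0, (40, t.size))
amp_at_tag = ssr_amplitude(epochs, fs, f_tag)    # close to the true amplitude of 1
```

With 2-s epochs the frequency resolution is 0.5 Hz, so a 14 Hz tag falls on an exact FFT bin; in practice, tagging frequencies and epoch lengths are chosen so that this holds.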

In all three experiments, participants were engaged in a visual discrimination task at central fixation. Participants were instructed to count brief contractions of the fixation cross while ignoring elongations. The task was designed to withdraw participants' attention from task-irrelevant peripheral visual, auditory and tactile reference stimuli and competitors in order to prevent attentional biases of inter-stimulus competition.

We hypothesized that if stimuli of different sensory modalities entered a crossmodal competition, we would observe suppression: SSR amplitudes during competitor-present periods would be lower than during competitor-absent periods. If, on the other hand, no such competition occurred, SSR amplitudes would remain constant. Additionally, in line with previous studies on intra-modal competition, we expected reduced SSR amplitudes during competitor-present periods to indicate suppression between stimuli within senses.
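The amplitude comparison expressed in these hypotheses can be illustrated with a minimal sketch, assuming per-participant SSR amplitudes for competitor-absent and competitor-present periods have already been extracted. The helper name and the one-sample t-test on percentage change are assumptions of the example, not the statistics reported in the paper:

```python
import numpy as np
from scipy import stats

def suppression_test(amp_absent, amp_present):
    """Test whether competitor-present SSR amplitudes are reduced.

    Computes per-participant percentage change relative to the
    competitor-absent baseline and tests it against zero; a reliably
    negative mean change indicates suppression.
    """
    amp_absent = np.asarray(amp_absent, dtype=float)
    amp_present = np.asarray(amp_present, dtype=float)
    change = (amp_present - amp_absent) / amp_absent * 100.0
    t_val, p_val = stats.ttest_1samp(change, 0.0)
    return change.mean(), t_val, p_val

# Toy data: 19 participants with a ~20% amplitude reduction under competition
rng = np.random.default_rng(0)
absent = rng.uniform(1.0, 2.0, 19)
present = absent * 0.8 + rng.normal(0.0, 0.02, 19)
mean_change, t_val, p_val = suppression_test(absent, present)
```

Under the no-competition hypothesis the percentage change would scatter around zero and the test would not reach significance.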

Consistently, across all combinations of stimuli, we found that suppression only occurred between stimuli of the same sensory modality but not between stimuli of different sensory modalities. Therefore, while well in line with biased competition governing processing within senses, our results challenge the notion of a biased competition for common processing capacities between senses.

Section snippets

Participants

Participants gave informed written consent prior to experiments. None reported a history of neurological diseases or injury. The experiments were conducted in accordance with the Declaration of Helsinki and the guidelines of the ethics committee of the University of Leipzig.

Stimuli

In each of the three experiments stimuli from two sensory modalities were presented. Experiment 1 employed visual and auditory stimuli. Experiment 2 featured the presentation of visual and tactile stimuli. In Experiment 3,

Participants

All participants reported normal or corrected-to-normal vision and normal hearing. From a total of twenty recorded participants, data of nineteen (age: 19–33 years, 7 women, all right-handed) entered analyses. Data of one participant were excluded from further analyses because more than 50% of reference-set epochs of each condition contained artifacts.

Stimuli and procedure

Experiment 1 investigated crossmodal interactions between visual and auditory stimuli. Hence, following the general experimental design described

Participants

Data from fifteen participants (age: 19–32 years, 10 women, one left-handed) were recorded and entered analyses. All participants reported normal or corrected-to-normal vision.

Stimuli and procedure

Experiment 2 investigated interactions between visual and tactile stimuli. Visual stimulus presentation was similar to Experiment 1. Vibro-tactile stimuli were generated by four small electromagnetic stimulators (Dancer Design). Two stimulators – carrying the reference stimulus vibration – were attached to the index

Participants

From a total of 17 recorded participants, data of 16 (age: 19–34 years, ten women, all right-handed) entered analyses. Data of one participant was excluded as she/he did not exhibit tSSRs above general noise level. All participants reported normal or corrected-to-normal vision and normal hearing.

Stimuli and procedure

Experiment 3 investigated interactions between auditory and tactile stimuli. Auditory stimulus presentation was similar to Experiment 1. Tactile stimulus presentation was similar to Experiment 2 with two

Discussion

The present study aimed to investigate whether visual, auditory and tactile stimuli compete for capacity-limited early sensory processing across sensory modalities. We conducted three experiments that probed putative audio-visual, visuo-tactile and audio-tactile stimulus interactions, respectively. Frequency-tagged continuous visual, auditory and tactile stimulus streams (‘reference’ stimuli) elicited oscillatory brain responses that indexed ongoing sensory processing in corresponding

Conclusion

We investigated competition for capacity-limited early sensory processing within and between visual, auditory and tactile modalities. In a series of experiments we found that concurrent unrelated and unattended stimuli entered a competition only when they were presented in the same sensory modality. Stimuli from different sensory modalities did not compete for common processing capacities, although we observed cross-modal interactions of other types for specific stimulus combinations. Absent

Acknowledgments

We thank Renate Zahn, Norman Forschack and Christopher Gundlach for data collection. The experimental stimulation was realized using Cogent 2000 developed by the Cogent 2000 team at the Functional Imaging Laboratory and the Institute of Cognitive Neuroscience and Cogent Graphics developed by John Romaya at the Laboratory of Neurobiology at the Wellcome Department of Imaging Neuroscience. Work was supported by the Deutsche Forschungsgemeinschaft (DFG; grant number GRK-1182, Graduate program

References (50)

  • T. Kawase et al., Contralateral white noise attenuates 40-Hz auditory steady-state fields but not N100m in auditory evoked fields, Neuroimage (2012)

  • C. Keitel et al., Early visual and auditory processing rely on modality-specific attentional resources, Neuroimage (2013)

  • L. Lemus et al., Do sensory cortices process more than one sensory modality during perceptual judgments?, Neuron (2010)

  • M.M. Müller et al., Effects of spatial selective attention on the steady-state visual evoked potential in the 20–28 Hz range, Brain Res. Cogn. Brain Res. (1998)

  • C. Nangini et al., Magnetoencephalographic study of vibrotactile evoked transient and steady-state responses in human somatosensory cortex, Neuroimage (2006)

  • U. Neisser et al., Selective looking: attending to visually specified events, Cognit. Psychol. (1975)

  • E. Porcu et al., Concurrent visual and tactile steady-state evoked potentials index allocation of inter-modal attention: a frequency-tagging study, Neurosci. Lett. (2013)

  • M. Severens et al., Transient and steady-state responses to mechanical stimulation of different fingers reveal interactions based on lateral inhibition, Clin. Neurophysiol. (2010)

  • D. Talsma et al., The multifaceted interplay between attention and multisensory integration, Trends Cogn. Sci. (2010)

  • B.L. Allman et al., Not just for bimodal neurons anymore: the contribution of unimodal neurons to cortical multisensory processing, Brain Topogr. (2009)

  • R. Arrighi et al., Vision and audition do not share attentional resources in sustained tasks, Front. Psychol. (2011)

  • L.R. Boulter, Attention and reaction times to signals of uncertain modality, J. Exp. Psychol. Hum. Percept. Perform. (1977)

  • D.E. Broadbent, Failures of attention in selective listening, J. Exp. Psychol. (1952)

  • C. Cappe et al., Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey, Eur. J. Neurosci. (2005)

  • A.D. Cate et al., Auditory attention activates peripheral visual cortex, PLoS One (2009)
    1 Joint first authors/equal contributions stated.
