
NeuroImage

Volume 70, 15 April 2013, Pages 258-267

Auditory modulation of visual stimulus encoding in human retinotopic cortex

https://doi.org/10.1016/j.neuroimage.2012.12.061
Open access under a Creative Commons license

Abstract

Sounds can modulate visual perception as well as neural activity in retinotopic cortex. Most studies in this context have investigated how sounds change response amplitude or reset the phase of ongoing oscillations in visual cortex. However, recent studies in macaque monkeys show that the congruence of audio-visual stimuli also modulates the amount of stimulus information carried by the spiking activity of primary auditory and visual neurons. Here, we used naturalistic video stimuli and recorded the spatial patterns of functional MRI signals in human retinotopic cortex to test whether the discriminability of such patterns varied with the presence and congruence of co-occurring sounds. We found that incongruent sounds significantly impaired stimulus decoding from area V2, with a similar trend in V3. This effect was associated with reduced inter-trial reliability of the patterns (i.e., higher noise levels), but was not accompanied by any detectable modulation of overall signal amplitude. We conclude that sounds modulate naturalistic stimulus encoding in early human retinotopic cortex without affecting overall signal amplitude. Subthreshold modulation, oscillatory phase reset and dynamic attentional modulation are candidate neural and cognitive mechanisms mediating these effects.
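
For readers unfamiliar with the decoding approach mentioned in the abstract, the sketch below illustrates the general logic of multivariate pattern analysis on simulated data: a linear classifier is trained to discriminate stimulus (video) identity from trial-wise voxel response patterns in a region of interest, and decoding accuracy is estimated with cross-validation. This is a minimal illustration only, not the authors' pipeline; the choice of classifier (a linear SVM), the simulated data dimensions, and the cross-validation scheme are assumptions made here for demonstration.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Simulated ROI data (e.g., voxels in V2): each video evokes a characteristic
# spatial pattern, and each trial adds independent noise on top of it.
n_videos, n_trials_per_video, n_voxels = 3, 20, 200
video_patterns = rng.normal(size=(n_videos, n_voxels))
X = np.vstack([
    video_patterns[v] + rng.normal(0, 2.0, size=(n_trials_per_video, n_voxels))
    for v in range(n_videos)
])                                             # shape: (n_trials, n_voxels)
y = np.repeat(np.arange(n_videos), n_trials_per_video)   # video identity labels

# Cross-validated decoding of video identity from voxel patterns.
decoder = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracy = cross_val_score(decoder, X, y, cv=cv)
print(f"mean decoding accuracy: {accuracy.mean():.2f} (chance = {1 / n_videos:.2f})")
```

Increasing the trial-by-trial noise in this toy example lowers decoding accuracy even though the underlying stimulus patterns are unchanged, which is the kind of effect the abstract attributes to incongruent sounds.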

Highlights

► Multivariate decoding of video identity from fMRI signals in V1–V3.
► Decoding accuracy in V2 is significantly reduced for incongruent sounds.
► Reduced decoding accuracy is associated with reduced inter-trial reliability.
► No modulation of univariate signal amplitude by sounds.
► Noise levels in sensory areas are affected by multisensory congruence.
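
The highlights link reduced decoding accuracy to reduced inter-trial reliability of the response patterns. The exact reliability metric is not specified here; one common choice, used below purely as an assumed illustration, is a split-half pattern correlation: the trials for a given stimulus are split into two halves, and the voxelwise correlation between the two half-mean patterns is taken as a reliability estimate (lower correlation indicating noisier responses).

```python
import numpy as np

def split_half_pattern_reliability(trials, seed=None):
    """Split-half reliability of a stimulus-evoked voxel pattern.

    trials : array of shape (n_trials, n_voxels) for one stimulus/condition.
    Returns the Pearson correlation between the mean patterns of two random
    halves of the trials; lower values indicate less reliable (noisier) responses.
    """
    rng = np.random.default_rng(seed)
    order = rng.permutation(trials.shape[0])
    half = trials.shape[0] // 2
    mean_a = trials[order[:half]].mean(axis=0)
    mean_b = trials[order[half:]].mean(axis=0)
    return np.corrcoef(mean_a, mean_b)[0, 1]

# Example: the same underlying pattern measured with low vs. high trial noise.
rng = np.random.default_rng(1)
pattern = rng.normal(size=200)
low_noise_trials = pattern + rng.normal(0, 1.0, size=(20, 200))
high_noise_trials = pattern + rng.normal(0, 3.0, size=(20, 200))
print(split_half_pattern_reliability(low_noise_trials, seed=2))   # closer to 1
print(split_half_pattern_reliability(high_noise_trials, seed=2))  # noticeably lower
```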

Keywords

Multisensory
Audio-visual
V2
Decoding
MVPA
fMRI
