A multisensory perspective on object memory
Introduction
Imagine that you are at a cocktail party and a friend is introducing you to a group of strangers. Let's call them Sarah, Kim and Deborah. Your friend introduces you and tells the group that you are a cognitive neuroscientist who is visiting town. During the next two minutes, you exchange a few sentences with Sarah. During those same two minutes, you only see Kim smiling politely as she shakes your hand, and you fail to hear Deborah introducing herself, because someone behind her shouts loudly to a friend standing in the other corner of the room (Fig. 1). A week after this cocktail party you are at a different gathering, where you once again see Sarah, Kim, and Deborah. Whose face will you recognise more easily?
Psychophysical, neurophysiological, and human brain imaging research over the last 40 years has greatly advanced our understanding of the cognitive and brain mechanisms that support perception and memory, as well as the interactions that they share in everyday situations (Constantinescu et al., 2016, Gazzaley and Nobre, 2012, Mahon and Caramazza, 2011, Summerfield and de Lange, 2014). In such everyday situations, when we encounter a new person or a new object, information about them is typically conveyed by more than a single sense. Indeed, under such multisensory circumstances, profound changes in behaviour and perception can be elicited, accompanied by striking changes in the patterns of brain activation and the networks that are engaged. Auditory-visual multisensory processes have been identified throughout functional cortical hierarchies, including primary cortices (reviewed in Murray et al., 2016a), infero-temporal and superior temporal regions (reviewed in Lewis, 2010, for the case of auditory-visual object processing), as well as prefrontal regions (reviewed in Murray and Wallace, 2012). Although much emphasis has been placed on behavioural and perceptual processes, recent work has shown that the presentation of sensory stimuli in a multisensory manner can also have profound effects on our memories, and provides important clues as to why, in the example above, you can recognise Sarah better than her friends on your second meeting. In the current review, we discuss the evidence that multisensory contexts can improve unisensory object discrimination even after a single exposure. We then specify the requisite conditions for such improvements as well as the individual differences therein.
Lastly, we place the reviewed behavioural and brain imaging findings within the broader literature on multisensory learning and discuss the importance of considering multisensory contributions when creating accurate models of object perception and memory.
While the experimental paradigm that we have employed has been described in detail previously (Thelen and Murray, 2013; for a summary, see Fig. 2a), we summarise it here briefly. We employed a continuous recognition task, in which on each trial participants have to indicate as quickly and accurately as possible whether they saw a given object for the first ("new") or second ("old") time. Across the different variations of this paradigm utilised in a number of studies over the years, stimuli within one sense (e.g. vision) were always task-relevant, while stimuli in another sense (e.g. audition) were always task-irrelevant. Initial and repeated trials were always equally probable, and unisensory and multisensory trials were likewise distributed equally across all trial types. Although some of our early work used a paradigm in which multisensory information was only presented on initial trials, subsequent work has replicated these effects even when rendering the multisensory content uninformative about the task-relevant dimension (i.e. whether an object was presented for the initial or repeated time). In this line of research, the effectiveness of three distinct multisensory contexts in improving memory has been assessed: 1) a semantically congruent context, where the task-relevant and task-irrelevant stimuli represent the same object (e.g., a drawing of a cow combined with a "moo" sound), 2) a meaningless-association context, where the task-relevant stimuli are paired with tones or noises, and 3) a semantically incongruent context, where the task-relevant and task-irrelevant stimuli represent different objects.
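To make the trial-balancing constraints of such a continuous recognition design concrete, the following sketch generates a sequence in which every object appears exactly twice (once "new", once "old") and half of the initial presentations carry a task-irrelevant sound. This is purely illustrative: the function, field names, and the choice to keep all repeated trials unisensory are our assumptions, not code from the original studies.

```python
import random


def make_trial_sequence(objects, seed=0):
    """Build a continuous old/new recognition sequence (illustrative sketch).

    Every object appears exactly twice: first as 'new', later as 'old'.
    Half of the initial presentations are multisensory (visual object plus
    a task-irrelevant sound); all other trials are unisensory, so initial
    and repeated trials are equally probable overall.
    """
    rng = random.Random(seed)
    shuffled = objects[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2

    trials = []
    for i, obj in enumerate(shuffled):
        context = "multisensory" if i < half else "unisensory"
        trials.append({"object": obj, "status": "new", "context": context})
        trials.append({"object": obj, "status": "old", "context": "unisensory"})

    rng.shuffle(trials)

    # Enforce the one ordering constraint: each object's 'new' trial must
    # precede its 'old' trial in the final sequence.
    positions = {}
    for idx, trial in enumerate(trials):
        positions.setdefault(trial["object"], []).append(idx)
    for i, j in positions.values():
        if trials[i]["status"] == "old":
            trials[i], trials[j] = trials[j], trials[i]
    return trials
```

A design like this keeps the "new"/"old" judgement orthogonal to the multisensory manipulation: because multisensory pairings occur equally often across trial types (or, as here, only at encoding), the sound carries no information about the task-relevant dimension.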
By manipulating the number (and type) of senses actively engaged, the nature of the relationship between the stimuli across the two senses, as well as their task-relevance, our paradigm sought to more closely emulate information processing in naturalistic environments. This evidence (and more recently that from other independent laboratories) has provided novel insights into the behavioural and brain mechanisms guiding memory and information processing in everyday situations. The overall message from these studies is that memory for objects is generally improved when the information is first encountered in a multisensory manner.
Which multisensory contexts improve memory?
In our paradigm, the benefits of multisensory presentation on object memory are generally observed as improved discrimination accuracy; reaction times (RTs) showed no comparable benefits (e.g., Lehmann and Murray, 2005). When the initial multisensory presentation (and encoding) involved semantically congruent pairings, robust memory improvements were observed on subsequent retrieval. These improvements were observed across studies employing different stimulus and
Brain correlates of implicit multisensory benefits in memory
The majority of our brain mapping studies have focused on the networks involved in visual memory, but all our studies employed the continuous old/new recognition paradigm described above (see Fig. 2b). In this section, we focus exclusively on brain responses elicited by repeated object presentations. Across both ERP and fMRI methods, portions of the lateral occipital cortex (LOC) were found to respond more strongly to naturalistic visual objects that had been initially accompanied by semantically
Individual differences in who benefits from multisensory contexts
Profound inter-individual differences were seen in our paradigm even among healthy adults. When we analysed the results of our studies involving initial meaningless multisensory contexts in more detail (Thelen et al., 2014), a bimodal distribution of behavioural effects was observed. Specifically, roughly equal proportions of participants improved and were impaired, both when the task was visual and when it was auditory. Despite differences in timing, the same brain
Cognitive mechanisms by which multisensory contexts improve memory
Before we draw more general conclusions from our findings, we have to note that our paradigm investigates a very specific but ethologically relevant situation. Namely, the task we utilise focuses on episodic memory (Have you seen this object before in this experimental block?), and the effectiveness of the processes underlying this memory system is investigated as a function of multisensory processes that are triggered likely outside of the observer's attentional focus (at least in many of our
Broader implications
The demonstration of benefits of multisensory contexts on memory advances our understanding of both multisensory processes in general as well as of memory and the organisation of semantic knowledge.
First, in terms of implications for multisensory processing, the reviewed findings demonstrate that the products of multisensory processes persist over time, influencing subsequent unisensory object perception. Multisensory processes associated with the initial encounter of an object will influence
Acknowledgements
We thank Antonia Thelen for comments on the initial drafts of the manuscript. PJM receives support from the Pierre Mercier Foundation. MTW receives support from the National Institutes of Health (grants CA183492, DC014114, HD083211, MH109225), from the Simons Foundation Autism Research Initiative and from the Wallace Foundation. MMM receives support from the Swiss National Science Foundation (grants 320030-149982 and 320030-169206 as well as National Centre of Competence in Research project
References

- et al. Print-specific multimodal brain activation in kindergarten improves prediction of reading skills in second grade. NeuroImage (2013)
- et al. Behavioral, perceptual, and neural alterations in sensory and multisensory function in autism spectrum disorder. Prog. Neurobiol. (2015)
- et al. Semantics and the multisensory brain: how meaning modulates processes of audio-visual integration. Brain Res. (2008)
- et al. Visual perceptual learning in human object recognition areas: a repetition priming study using high-density electrical mapping. NeuroImage (2001)
- et al. Remembrance of odors past: human olfactory cortex in cross-modal recognition memory. Neuron (2004)
- et al. The role of multisensory memories in unisensory object discrimination. Cogn. Brain Res. (2005)
- et al. What drives the organization of object knowledge in the brain? The distributed domain-specific hypothesis. Trends Cogn. Sci. (2011)
- et al. The context-contingent nature of cross-modal activations of the visual cortex. NeuroImage (2016)
- et al. Perceptual-mnemonic functions of the perirhinal cortex. Trends Cogn. Sci. (1999)
- et al. Rapid discrimination of visual and multisensory memories revealed by electrical neuroimaging. NeuroImage (2004)
- The brain uses single-trial multisensory memories to discriminate without awareness. NeuroImage
- Plasticity in representations of environmental sounds revealed by electrical neuroimaging. NeuroImage
- Neuroplasticity: unexpected consequences of early blindness. Curr. Biol.
- Multisensory processes: a balancing act across the lifespan. Trends Neurosci.
- Cortical regions underlying successful encoding of semantically congruent and incongruent associations between common auditory and visual objects. Neurosci. Lett.
- Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Curr. Biol.
- Encoding-retrieval overlap in human episodic memory: a functional neuroimaging perspective. Prog. Brain Res.
- The neural basis of visual dominance in the context of audio-visual object processing. NeuroImage
- Audiovisual integration in human superior temporal sulcus: inverse effectiveness and the neural processing of speech and object recognition. NeuroImage
- Electrical neuroimaging of memory discrimination based on single-trial multisensory learning. NeuroImage
- Single-trial multisensory memories affect later auditory and visual object discrimination. Cognition
- Early “visual” cortex activation correlates with superior verbal memory performance in the blind. Nat. Neurosci.
- Memory
- Unraveling multisensory integration: patchy organization within human STS multisensory cortex. Nat. Neurosci.
- Neural correlates of auditory repetition priming: reduced fMRI activation in the auditory cortex. J. Cogn. Neurosci.
- Auditory recognition memory is inferior to visual recognition memory. Proc. Natl. Acad. Sci. USA
- Responses of inferior temporal cortex and hippocampal neurons during delayed matching-to-sample in monkeys (Macaca fascicularis). Behav. Neurosci.
- Organizing conceptual knowledge in humans with a gridlike code. Science
- Eye can hear clearly now: inverse effectiveness in natural audiovisual speech processing relies on long-term crossmodal temporal integration. J. Neurosci.
- Perceptual and semantic contributions to repetition priming of environmental sounds. Cereb. Cortex
- Top-down control and early multisensory processes: chicken vs. egg. Front. Integr. Neurosci.
- Linking multisensory and memory functions in ageing. Neurobiol. Aging
- Top-down modulation: bridging selective attention and working memory. Trends Cogn. Sci.
- Sensory modality specificity of neural activity related to memory in visual cortex. J. Neurophysiol.
- The differing impact of multisensory and unisensory integration on behavior. J. Neurosci.
- Neural substrates of crossmodal association memory in monkeys: the amygdala versus the anterior rhinal cortex. Behav. Neurosci.