
Neuropsychologia

Volume 105, October 2017, Pages 243-252

A multisensory perspective on object memory

https://doi.org/10.1016/j.neuropsychologia.2017.04.008

Highlights

  • A single multisensory exposure influences memory for visual and auditory objects.

  • Objects with a task-irrelevant stimulus in another sense were better remembered.

  • Brain discriminates objects according to prior multisensory contexts within 100 ms.

  • We specify aspects of how memory operates on the multisensory nature of objects.

Abstract

Traditional studies of memory and object recognition involved objects presented within a single sensory modality (i.e., purely visual or purely auditory objects). However, in naturalistic settings, objects are often evaluated and processed in a multisensory manner. This raises the question of how object representations that combine information from the different senses are created and utilised by memory functions. Here we review research that has demonstrated that a single multisensory exposure can influence memory for both visual and auditory objects. In an old/new object discrimination task, objects that were presented initially with a task-irrelevant stimulus in another sense were better remembered than stimuli presented alone, most notably when the two stimuli were semantically congruent. The brain discriminates between these two types of object representations within the first 100 ms post-stimulus onset, indicating early “tagging” of objects/events by the brain based on the nature of their initial presentation context. Interestingly, the specific brain networks supporting the improved object recognition vary with a number of factors, including the effectiveness of the initial multisensory presentation and the sense that is task-relevant. We specify the requisite conditions for multisensory contexts to improve object discrimination following single exposures, and the individual differences that exist with respect to these improvements. Our results shed light on how memory operates on the multisensory nature of object representations as well as how the brain stores and retrieves memories of objects.

Introduction

Imagine that you are at a cocktail party and you are being introduced by a friend to a group of strangers. Let's call them Sarah, Kim and Deborah. Your friend introduces you and tells the group that you are a cognitive neuroscientist who is visiting town. Over the next two minutes, you exchange a few sentences with Sarah. During the same two minutes, you only see Kim smiling politely as she shakes your hand, and you do not happen to hear Deborah introducing herself, because someone behind her shouts loudly to his friend standing in the other corner of the room (Fig. 1). A week after this cocktail party you are at a different gathering, where you once again see Sarah, Kim, and Deborah. Whose face will you recognise more easily?

Psychophysical, neurophysiological, and human brain imaging research over the last 40 years has greatly advanced our understanding of the cognitive and brain mechanisms that support perception and memory, as well as the interactions that they share in everyday situations (Constantinescu et al., 2016, Gazzaley and Nobre, 2012, Mahon and Caramazza, 2011, Summerfield and de Lange, 2014). In such everyday situations, when we encounter a new person or a new object, information about them is typically conveyed by more than a single sense. Indeed, under such multisensory circumstances, profound changes in behaviour and perception can be elicited, and these changes are accompanied by striking differences in the patterns of brain activation and the networks that are engaged. Auditory-visual multisensory processes have been identified throughout functional cortical hierarchies, including primary cortices (reviewed in Murray et al., 2016a), infero-temporal and superior temporal regions (reviewed in Lewis, 2010, for the case of auditory-visual object processing), as well as prefrontal regions (reviewed in Murray and Wallace, 2012). Although much emphasis has been placed on behavioural and perceptual processes, recent work has shown that the presentation of sensory stimuli in a multisensory manner can also have profound effects on our memories, and it provides important clues as to why, in the example above, you would recognise Sarah better than her friends at your second meeting. In the current review, we discuss the evidence that multisensory contexts can improve unisensory object discrimination even after a single exposure. We then specify the requisite conditions for such improvements as well as the individual differences therein. Lastly, we place the reviewed behavioural and brain imaging findings within the broader literature on multisensory learning and discuss the importance of considering multisensory contributions when creating accurate models of object perception and memory.

While the experimental paradigm that we have employed has been described in detail previously (Thelen and Murray, 2013; for a summary, see Fig. 2a), we summarise it here briefly. We employed a continuous recognition task in which, on each trial, participants had to indicate as quickly and accurately as possible whether they were seeing a given object for the first (“new”) or second (“old”) time. Across the different variations of this paradigm utilised in a number of studies over the years, stimuli within one sense (e.g., vision) were always task-relevant, while stimuli in another sense (e.g., audition) were always task-irrelevant. Initial and repeated trials were always equally probable, and unisensory and multisensory trials were likewise equally distributed across trial types. While some of our early work involved a paradigm in which multisensory information was presented only on initial trials, subsequent work has replicated the effects even when the multisensory content was rendered uninformative about the task-relevant dimension (i.e., whether an object was being presented for the initial or repeated time). In this line of research, the effectiveness of three distinct multisensory contexts in improving memory has been assessed: 1) a semantically congruent context – where the task-relevant and task-irrelevant stimuli represent the same object (e.g., a drawing of a cow combined with a “moo” sound), 2) a meaningless-association context – where the task-relevant stimuli are paired with tones or noises, and 3) a semantically incongruent context – where the task-relevant and task-irrelevant stimuli represent different objects.
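To make the trial structure concrete, the sketch below (in Python) illustrates one way such a balanced sequence could be constructed. It is a minimal illustration: the stimulus labels, the even split across contexts, and the lag handling are our own assumptions, and this is not the code used in the original studies.

    import random

    # Illustrative sketch only: build a continuous old/new recognition sequence in
    # which vision is task-relevant. Context names and balancing are assumptions.
    CONTEXTS = [
        "visual_only",               # unisensory baseline
        "semantically_congruent",    # e.g., drawing of a cow + "moo"
        "meaningless_association",   # e.g., drawing of a cow + a pure tone
        "semantically_incongruent",  # e.g., drawing of a cow + an unrelated sound
    ]

    def build_sequence(objects, seed=0):
        """Each object appears exactly twice ("new" then "old"), so the two
        responses are equally probable; initial presentations are split evenly
        across the four contexts (assumes len(objects) is a multiple of 4)."""
        rng = random.Random(seed)
        objects = list(objects)
        rng.shuffle(objects)
        trials = []
        for i, obj in enumerate(objects):
            ctx = CONTEXTS[i % len(CONTEXTS)]
            trials.append({"object": obj, "status": "new", "context": ctx})
            # In the early variant of the paradigm, repetitions were presented in
            # the task-relevant sense alone.
            trials.append({"object": obj, "status": "old", "context": "visual_only"})
        rng.shuffle(trials)
        # Ensure each object's initial presentation precedes its repetition.
        positions = {}
        for idx, trial in enumerate(trials):
            positions.setdefault(trial["object"], []).append(idx)
        for first_idx, second_idx in positions.values():
            if trials[first_idx]["status"] == "old":
                trials[first_idx], trials[second_idx] = trials[second_idx], trials[first_idx]
        return trials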

By manipulating the number (and type) of senses actively engaged, the nature of the relationship between the stimuli across the two senses, as well as their task-relevance, our paradigm sought to more closely emulate information processing in naturalistic environments. This evidence (and more recently that from other independent laboratories) has provided novel insights into the behavioural and brain mechanisms guiding memory and information processing in everyday situations. The overall message from these studies is that memory for objects is generally improved when the information is first encountered in a multisensory manner.

Section snippets

Which multisensory contexts improve memory?

In our paradigm, the benefits on object memory of presenting information in a multisensory manner are generally observed as improved discrimination accuracy; reaction times (RTs) show no comparable benefits (e.g., Lehmann and Murray, 2005). When the initial multisensory presentation (and encoding) involved semantically congruent pairings, robust memory improvements were observed at subsequent retrieval. These improvements were observed across studies employing different stimulus and …
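As a hypothetical illustration of how this accuracy benefit could be tabulated, the sketch below groups performance on repeated (“old”) trials by the context of each object's initial presentation. The trial format follows the hypothetical sequence-building sketch above; the use of simple proportion correct, rather than a sensitivity measure, is our own simplifying assumption.

    from collections import defaultdict

    def accuracy_by_initial_context(trials, responses, initial_context):
        """Proportion of correct "old" judgements on repeated trials, grouped by
        the (hypothetical) context in which each object was first encountered.
        `responses` holds one "new"/"old" response per trial; `initial_context`
        maps each object to the context of its initial presentation."""
        correct = defaultdict(int)
        total = defaultdict(int)
        for trial, response in zip(trials, responses):
            if trial["status"] != "old":
                continue  # only repeated presentations enter the memory measure
            ctx = initial_context[trial["object"]]
            total[ctx] += 1
            correct[ctx] += int(response == "old")
        return {ctx: correct[ctx] / total[ctx] for ctx in total}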

Brain correlates of implicit multisensory benefits in memory

The majority of our brain mapping studies have focused on the networks involved in visual memory, but all of our studies employed the continuous old/new recognition paradigm described above (see Fig. 2b). In this section, we focus exclusively on brain responses elicited by repeated object presentations. Across both ERP and fMRI methods, portions of the lateral occipital cortex (LOC) were found to respond more strongly to naturalistic visual objects that had been initially accompanied by semantically …

Individual differences in who benefits from multisensory contexts

Profound inter-individual differences were observed in our paradigm even when the participants were healthy adults. When we analysed the results of our studies involving initial meaningless multisensory contexts in more detail (Thelen et al., 2014), a bimodal distribution of behavioural effects was observed. Specifically, roughly equal proportions of participants showed improvements and impairments, both when the task was visual and when it was auditory. Despite differences in timing, the same brain …
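One hypothetical way to quantify such individual differences is sketched below: each participant's benefit is taken as the difference in retrieval accuracy between objects initially encountered with a task-irrelevant sound and objects initially encountered alone, and participants are then split by the sign of that difference. The data layout and function names are illustrative assumptions, not the analysis pipeline of Thelen et al. (2014).

    def multisensory_benefit(acc_multisensory, acc_unisensory):
        """Per-participant benefit: difference in old/new accuracy for objects
        first encountered with a task-irrelevant sound versus alone. Both inputs
        are dicts mapping participant id -> proportion correct."""
        return {pid: acc_multisensory[pid] - acc_unisensory[pid]
                for pid in acc_multisensory}

    def split_by_benefit(benefits):
        """Split participants into those improved versus impaired by the initial
        multisensory context, based on the sign of the accuracy difference."""
        improved = [pid for pid, b in benefits.items() if b > 0]
        impaired = [pid for pid, b in benefits.items() if b < 0]
        return improved, impaired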

Cognitive mechanisms by which multisensory contexts improve memory

Before we draw more general conclusions from our findings, we have to note that our paradigm investigates a very specific but ethologically relevant situation. Namely, the task we utilise focuses on episodic memory (Have you seen this object before in this experimental block?), and the effectiveness of the processes underlying this memory system is investigated as a function of multisensory processes that are likely triggered outside of the observer's attentional focus (at least in many of our …

Broader implications

The demonstration of benefits of multisensory contexts on memory advances our understanding both of multisensory processes in general and of memory and the organisation of semantic knowledge.

First, in terms of implications for multisensory processing, the reviewed findings demonstrate that the products of multisensory processes persist over time, influencing subsequent unisensory object perception. Multisensory processes associated with the initial encounter of an object will influence …

Acknowledgements

We thank Antonia Thelen for comments on the initial drafts of the manuscript. PJM receives support from the Pierre Mercier Foundation. MTW receives support from the National Institutes of Health (grants CA183492, DC014114, HD083211, MH109225), from the Simons Foundation Autism Research Initiative and from the Wallace Foundation. MMM receives support from the Swiss National Science Foundation (grants 320030-149982 and 320030-169206 as well as National Centre of Competence in Research project …

References (73)

  • M.M. Murray et al.

    The brain uses single-trial multisensory memories to discriminate without awareness

    NeuroImage

    (2005)
  • M.M. Murray et al.

    Plasticity in representations of environmental sounds revealed by electrical neuroimaging

NeuroImage

    (2008)
  • M.M. Murray et al.

    Neuroplasticity: unexpected consequences of early blindness

    Curr. Biol.

    (2015)
  • M.M. Murray et al.

    Multisensory processes: a balancing act across the lifespan

    Trends Neurosci.

    (2016)
  • H.R. Naghavi et al.

    Cortical regions underlying successful encoding of semantically congruent and incongruent associations between common auditory and visual objects

    Neurosci. Lett.

    (2011)
  • V. Romei et al.

    Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds

    Curr. Biol.

    (2009)
  • M.D. Rugg et al.

    Encoding-retrieval overlap in human episodic memory: a functional neuroimaging perspective

    Prog. Brain Res.

    (2008)
  • C. Schmid et al.

    The neural basis of visual dominance in the context of audio-visual object processing

    NeuroImage

    (2011)
  • R.A. Stevenson et al.

    Audiovisual integration in human superior temporal sulcus: inverse effectiveness and the neural processing of speech and object recognition

    NeuroImage

    (2009)
  • A. Thelen et al.

    Electrical neuroimaging of memory discrimination based on single-trial multisensory learning

    NeuroImage

    (2012)
  • A. Thelen et al.

    Single-trial multisensory memories affect later auditory and visual object discrimination

    Cognition

    (2015)
  • A. Amedi et al.

    Early “visual” cortex activation correlates with superior verbal memory performance in the blind

    Nat. Neurosci.

    (2003)
  • A. Baddeley et al.

    Memory

    (2009)
  • M.S. Beauchamp et al.

    Unraveling multisensory integration: patchy organization within human STS multisensory cortex

    Nat. Neurosci.

    (2004)
  • D. Bergerbest et al.

    Neural correlates of auditory repetition priming: reduced fMRI activation in the auditory cortex

    J. Cogn. Neurosci.

    (2004)
  • M.A. Cohen et al.

    Auditory recognition memory is inferior to visual recognition memory

    Proc. Natl. Acad. Sci. USA

    (2009)
  • M. Colombo et al.

    Responses of inferior temporal cortex and hippocampal neurons during delayed matching-to-sample in monkeys (Macaca fascicularis)

    Behav. Neurosci.

    (1994)
  • A.O. Constantinescu et al.

    Organizing conceptual knowledge in humans with a gridlike code

    Science

    (2016)
  • M.J. Crosse et al.

    Eye can hear clearly now: inverse effectiveness in natural audiovisual speech processing relies on long-term crossmodal temporal integration

    J. Neurosci.

    (2016)
  • M. De Lucia et al.

    Perceptual and semantic contributions to repetition priming of environmental sounds

    Cereb. Cortex

    (2010)
  • R. De Meo et al.

    Top-down control and early multisensory processes: chicken vs. egg

    Front. Integr. Neurosci.

    (2015)
  • A.F. Eardley et al.

    Linking multisensory and memory functions in ageing

    Neurobiol. Aging

    (2017)
  • A. Gazzaley et al.

    Top-down modulation: bridging selective attention and working memory

    Trends Cogn. Sci.

    (2012)
  • J.R. Gibson et al.

Sensory modality specificity of neural activity related to memory in visual cortex

    J. Neurophysiol.

    (1997)
  • G. Gingras et al.

The differing impact of multisensory and unisensory integration on behavior

    J. Neurosci.

    (2009)
  • S. Goulet et al.

    Neural substrates of crossmodal association memory in monkeys: the amygdala versus the anterior rhinal cortex

    Behav. Neurosci.

    (2001)