Common and differential electrophysiological mechanisms underlying semantic object memory retrieval probed by features presented in different stimulus types
Introduction
Object knowledge, as a specific form of semantic memory that is essential for interacting with our environments, is represented in multiple sensory, motor, and cognitive semantic subsystems (Allport, 1985, Martin, 2007, Hart et al., 2007). Probing various properties/features of objects has been found to elicit activations in their corresponding modality-specific brain regions, including visual form (shape), visual attribute (color), sound, smell, taste, manipulability, touch, motion, etc. (Martin, 2007, Martin and Chao, 2001, Goldberg et al., 2006, Kellenbach et al., 2001, Kellenbach et al., 2003, Beauchamp et al., 2002, Beauchamp et al., 2003, Noppeney and Price, 2002, Kraut et al., 2002a, Kraut et al., 2006). How different properties of an object (for instance, a cat being an animal, having four legs and fur, and purring) are recalled and integrated to cohere as a single concept remains poorly understood.
A mechanistic account of the processes involved in integrating these multiple representations into a whole, the Neural Hybrid model, has been proposed by Hart and Kraut (2007). Under this model, an object concept is stored on the basis of distinct neural encodings for category-based and/or feature-based semantic knowledge representations that exist in separate subsystems, including various sensory, motor, lexical-semantic, and limbic systems. Activity in these distributed systems is coordinated through interactions between the medial superior frontal cortex (medial BA-6 in the pre-supplementary motor area, pre-SMA), caudate, and thalamus (Hart et al., 2013). We have probed the interactions between these brain regions using the Semantic Object Retrieval Test (SORT), in which subjects have to decide whether two features result in retrieval of a particular object (Kraut et al., 2002b). The term “feature” here refers to many aspects of object knowledge: for an object such as a cat, features include attributes (tail), actions (meow), functions (pet), etc. In each trial of the SORT, two features are given, for example, “humps” and “desert”, for subjects to produce an answer; in this case, “camel”. There are also pairs of features that do not typically result in any object memory retrieval, for example, “humps” and “monitor”. The former is called a retrieval trial and the latter a non-retrieval trial. The majority of experimental paradigms targeting semantic memory have used either verification or priming in the context of word associations and semantic relations (Martin, 2007, Kutas and Federmeier, 2000). Most of these tasks do not mandate retrieval of a specific concept (e.g., an object) but instead tap processing of meaning in general, reporting on category membership or semantic relatedness between stimuli (probed as individual words/pictures or in the context of a sentence).
The SORT task differs in that participants are required to directly evaluate whether the features result in retrieval of an object memory or not by making an explicit response.
Given the strong emphasis on synchronization of neural activity in the Neural Hybrid model, further clarification of the neural mechanisms underlying the retrieval processes requires techniques with sufficient temporal resolution. Scalp EEG, with millisecond resolution, is a non-invasive technique that records the summation of post-synaptic excitatory and inhibitory potentials, predominantly from the cortical structures immediately subjacent to the recording electrode. EEG data can be processed to extract event-related spectral perturbations (ERSPs) and event-related potentials (ERPs). ERSP examines the spectral decomposition of EEG data, which can dissociate differential effects across multiple frequency bands, each of which may be associated with a particular set of cognitive processes (Cohen, 2014, Delorme and Makeig, 2004). ERPs derive from averaging EEG epochs to capture consistent changes in phase-locked neural activity, as reflected in the timing and shape of the ERP waveforms (Luck, 2005). To date, several neurophysiological studies using either technique have examined semantic object memory retrieval during SORT (Ferree et al., 2009, Brier et al., 2008, Chiang et al., 2014, Chiang et al., 2015).
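The ERP/ERSP distinction can be illustrated with a minimal Python sketch using NumPy and SciPy on placeholder random epochs; the sampling rate, epoch length, and spectrogram window parameters below are illustrative assumptions, not the settings of the present study. Averaging raw epochs (ERP) preserves only phase-locked activity, whereas averaging single-trial power (ERSP) also captures induced oscillations:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 250                                   # assumed sampling rate (Hz)
n_trials, n_samples = 100, 2 * fs          # 100 placeholder epochs of 2 s, one channel
epochs = rng.standard_normal((n_trials, n_samples))

# ERP: average the raw epochs; non-phase-locked activity cancels out.
erp = epochs.mean(axis=0)

# ERSP: spectrally decompose each epoch first, then average the power across
# trials, so induced (non-phase-locked) oscillations are retained.
f, t, Sxx = signal.spectrogram(epochs, fs=fs, nperseg=64, noverlap=48, axis=-1)
ersp = Sxx.mean(axis=0)                    # frequencies x time bins

print(erp.shape, ersp.shape)
```

The key design point is the order of operations: power is computed per trial before averaging for the ERSP, while the ERP averages the signed voltage first.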
In the previous version of SORT, the two features were always presented in the visual word form. Neural mechanisms identified by previous SORT-based studies may therefore reflect activation of object retrieval through only one stimulus type (i.e., the visual word system) and may not generalize to other presentation modalities (auditory stimuli) or domains (nonverbal stimuli). In our daily lives we receive information in a great variety of formats, and are able to integrate information and extract meanings or identify common objects. For example, even though seeing a picture of a tiger can be very different from reading the word “tiger”, both may activate overlapping neural representations of the concept. Still, it is far from settled whether semantic object representations and their retrieval are subserved by a unitary system or by multiple semantic subsystems (Binder and Desai, 2011, Damasio, 1990, Hart and Gordon, 1992, Patterson et al., 2007). Separate lines of research have supported the existence of a unitary system (Simanova et al., 2014, Lambon Ralph, 2013, Binder et al., 2009) as well as multiple semantic subsystems (Martin, 2007, Martin and Chao, 2001). It may be that both exist, but the degree to which these systems are involved or interact is still debated (Simmons and Martin, 2009, Bonner and Price, 2013, Tsapkini et al., 2011). Multiple semantic subsystems may operate differently as a function of object features (visual color, visual form, touch; Goldberg et al., 2006, Kellenbach et al., 2001, Kellenbach et al., 2003) or the modality in which object features are presented (verbal vs. nonverbal stimuli, visual vs. auditory stimuli; Beauchamp et al., 1999, Chao and Martin, 1999). Studies have found that multi-modality input, compared to uni-modality input, results in increased activation in multi-modal processing brain regions or even in primary sensory regions (Senkowski et al., 2008).
This multimodal nature of information integration could also occur in semantic integration between multiple semantic subsystems, but neither this integration nor how it affects object memory retrieval has been extensively investigated.
To begin to address these questions, we modified the previous SORT to include two main distinctions in stimulus types: stimulus modality (e.g., visual vs. auditory) and stimulus domain (e.g., verbal vs. nonverbal). In the new SORT paradigm, instead of presenting two visual words simultaneously, features were presented sequentially, one at a time. The first feature was presented in one of three stimulus formats: written (visual) words, spoken (auditory) words, or pictures. This was followed by the second feature, always presented as a visual word. The effect of how object memory is probed by stimulus modality (visual vs. auditory) could then be examined by comparing the visual word to the auditory word task, while the effect of stimulus domain (verbal vs. nonverbal) could be examined by comparing the visual word to the picture task. In order to examine the neural mechanisms time-locked to both stimulus onset and response, we evaluated EEG responses time-locked to the second stimulus (always a visual word) and to the response on a trial-by-trial basis. Stimulus-locked analysis can dissociate processes involved in attentional and memory integration, while response-locked analysis can dissociate processes involved in the accumulation and integration of memory information leading to a decision (Werkle-Bergner et al., 2014).
Since electrophysiological responses may contain both evoked (phase-locked) and induced (oscillatory but not phase-locked) neural activity, we used trial-based power spectral analysis, which can report on both types of neural responses (Cohen, 2014, Roach and Mathalon, 2008). This time-frequency power analysis allowed us to detect and evaluate EEG synchronization (increase in power compared to baseline) and desynchronization (decrease in power compared to baseline), which represent coupling and uncoupling, respectively, of multiple neuronal populations involved in retrieval of object memory (Pfurtscheller and Lopes da Silva, 1999). One prior study using EEG power analysis during the original visual word-only version of SORT (Ferree et al., 2009) showed an early-onset, long-duration delta synchronization (~ 1 Hz), maximal at both the midline frontal and occipital sites, in retrieval trials compared to non-retrieval trials, suggesting a prolonged search and selection process that leads to successful retrieval (Hart et al., 2013). In addition, later high-beta synchronization (20–35 Hz, after 1 s post-stimulus) was found at frontal midline and left frontal sites, implicating the end of object retrieval. The latter finding corresponds closely to the temporal pattern and spectral characteristics observed via intra-thalamic electrical recordings in Slotnick et al. (2002).
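The baseline-relative synchronization/desynchronization measure described above can be sketched as follows, again on placeholder data with assumed parameters. Single-trial power is averaged across trials before normalization, so induced as well as evoked activity contributes; power at each frequency is then expressed as dB change from the pre-stimulus baseline:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 250                                   # assumed sampling rate (Hz)
n_trials, n_samples = 80, 500              # placeholder 2-s epochs, one channel
t = np.arange(n_samples) / fs - 0.5        # -0.5 s baseline to ~1.5 s post-stimulus
epochs = rng.standard_normal((n_trials, n_samples))

# Single-trial spectrograms, averaged across trials (retains induced activity).
f, tt, Sxx = signal.spectrogram(epochs, fs=fs, nperseg=64, noverlap=56, axis=-1)
power = Sxx.mean(axis=0)                   # frequencies x time bins

# dB change from the mean pre-stimulus power at each frequency.
times = tt + t[0]                          # spectrogram bin centers re: stimulus onset
baseline = power[:, times < 0].mean(axis=1, keepdims=True)
ersp_db = 10 * np.log10(power / baseline)

# Positive values = synchronization (power above baseline);
# negative values = desynchronization (power below baseline).
print(ersp_db.shape)
```

With random input the dB map fluctuates around zero; in real data, sustained positive or negative clusters in a band would correspond to the synchronization/desynchronization effects discussed in the text.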
We focused on four EEG frequency bands, based on the results from prior studies that have suggested that EEG signals in these bands reflect processes important in lexical and semantic processing. These frequency bands are delta (1–4 Hz), theta (4–7 Hz), alpha (8–12 Hz) and low beta (13–19 Hz). Overall, alpha and low beta desynchronization have been shown to be associated with retrieval of lexical and semantic information (Bakker et al., 2015, Bastiaansen et al., 2008, Berger et al., 2014, He et al., 2015, Kielar et al., 2014, Li and Yang, 2013, Shahin et al., 2009, Strauß et al., 2014, Willems et al., 2008). Theta synchronization is linked to memory processes involved in lexical and semantic processing as well as in working memory and executive functions during memory retrieval (Bastiaansen et al., 2008, Bakker et al., 2015, Ketz et al., 2014, Li and Yang, 2013, Maguire et al., 2010, Shahin et al., 2009, Strauß et al., 2014). Delta synchronization indexes inhibition of irrelevant processes or attention allocation during cognitive operations, including working memory and semantic tasks (Harmony, 2013, Brunetti et al., 2013, Güntekin and Başar, 2016). We used these measures to detect stimulus-type (modality and domain) dependent similarities and differences in neural responses during semantic memory retrieval.
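A minimal sketch of extracting mean power in the four bands of interest, using Welch's method on placeholder single-channel data (the band edges come from the text; the sampling rate and window length are assumptions):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs = 250                                   # assumed sampling rate (Hz)
eeg = rng.standard_normal(10 * fs)         # 10 s of placeholder single-channel EEG

# Frequency bands of interest (Hz), as defined in the text.
bands = {"delta": (1, 4), "theta": (4, 7), "alpha": (8, 12), "low_beta": (13, 19)}

# Welch power spectral density (0.5-Hz resolution with a 2-s window),
# then mean power within each band.
f, psd = signal.welch(eeg, fs=fs, nperseg=2 * fs)
band_power = {name: psd[(f >= lo) & (f <= hi)].mean()
              for name, (lo, hi) in bands.items()}

for name, p in band_power.items():
    print(f"{name}: {p:.4f}")
```

In a time-resolved analysis, the same band masks would be applied to the time-frequency power described earlier rather than to a whole-epoch PSD.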
We hypothesized that if the mechanisms underlying semantic object memory retrieval are supported by multiple subsystems and thus vary with input format, we would observe effects modulated by stimulus type (modality or domain) at the behavioral and/or neural level. Since alpha and beta desynchronization have been linked to semantic memory retrieval processes, we predicted that modality- or domain-dependent effects would be found in these frequency ranges, either in the stimulus-locked or the response-locked analysis. Since the second stimulus was always a visual word, any differential effects between stimulus types would not be readily explained by sensory or perceptual differences in the stimuli. Alternatively, if some underlying mechanisms are unitary and do not vary with the format of input information, we would expect common effects among different stimulus types, independent of input format. These retrieval mechanisms may not be mutually exclusive, and examining these questions is important for understanding how different types of information are channeled via multiple semantic subsystems to activate a coherent object memory representation.
Subjects
Forty-eight young adult human subjects participated, with 16 subjects in each version of the SORT (auditory word: 12 F, Mage = 21.4 years, SD = 2.7; visual word: 10 F, Mage = 21.4 years, SD = 2.9; picture: 13 F, Mage = 23.6 years, SD = 5.1). All were right-handed (Edinburgh Handedness Inventory > 40), native English speakers. Exclusion criteria included a history of neurological or psychiatric disorders, current treatment with psychotropic medications, traumatic brain injury, learning disabilities and
Behavioral results
Group average RT and accuracy are presented in Table 1. For RT, the omnibus 2-way ANOVA revealed a significant main effect of condition, F(1,45) = 41.8, p < 0.001, overall, with retrieval trials (1015 ms) responded to more quickly than non-retrieval trials (1186 ms); the main effect of stimulus type was not significant (p = 0.079). A significant main effect of stimulus type (using 1-way ANOVAs) was found only for non-retrieval trials, F(2,45) = 3.2, p = 0.048, but not for retrieval trials (p = 0.122), which
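The stimulus-type comparison above is a one-way between-subjects ANOVA over the three groups (16 subjects each, giving the df = 2, 45 reported). A sketch with SciPy, using hypothetical per-subject mean RTs rather than the study's actual data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical per-subject mean non-retrieval RTs (ms), 16 subjects per
# between-subjects stimulus-type group; the means/SDs are illustrative only.
rt_visual = rng.normal(1150, 90, 16)
rt_auditory = rng.normal(1230, 90, 16)
rt_picture = rng.normal(1180, 90, 16)

# One-way between-subjects ANOVA on stimulus type (df = 2, 45).
F, p = stats.f_oneway(rt_visual, rt_auditory, rt_picture)
print(f"F(2,45) = {F:.2f}, p = {p:.3f}")
```

The omnibus condition × stimulus-type analysis in the text is a mixed-design 2-way ANOVA, which would additionally require a repeated-measures factor for condition (retrieval vs. non-retrieval) within subjects.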
Discussion
Using a modified version of SORT where we varied the stimulus type of the first object feature followed by a second feature that was always a visual word, we found effects that are modulated by stimulus modality/domain in the first feature and those that are common to all stimulus types underlying semantic object memory retrieval. Behaviorally, non-retrieval trials had longer RT and better accuracy compared to retrieval trials. These effects have been consistently shown in all previous SORT
Acknowledgments
The study was funded by the Berman Research Initiative at the Center for BrainHealth. The authors thank Rajen Patel, Athula Pudhiyidath, and Bambi DeLarosa for their invaluable assistance in data collection and comments.
References (69)
- et al. I see what you mean: theta power increases are involved in the retrieval of lexical semantic information. Brain Lang. (2008)
- et al. Parallel visual motion processing streams for manipulable objects and human movements. Neuron (2002)
- et al. The neurobiology of semantic memory. Trends Cogn. Sci. (2011)
- et al. Age-related changes in feature-based object memory retrieval as measured by event-related potentials. Biol. Psychol. (2014)
- Category-related recognition defects as a clue to the neural substrates of knowledge. Trends Neurosci. (1990)
- et al. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods (2004)
- et al. Space–time–frequency analysis of EEG data using within-subject statistical tests followed by sequential PCA. NeuroImage (2009)
- et al. Review of evoked and event-related delta responses in the human brain. Int. J. Psychophysiol. (2016)
- et al. Semantic memory retrieval circuit: role of pre-SMA, caudate, and thalamus. Brain Lang. (2013)
- et al. The EEG and fMRI signatures of neural integration: an investigation of meaningful gestures and corresponding speech. Neuropsychologia (2015)
- Classification aided analysis of oscillatory signatures in controlled retrieval. NeuroImage
- EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis. Brain Res. Rev.
- Alpha-band oscillations, attention, and controlled access to stored information. Trends Cogn. Sci.
- Neural activation during an explicit categorization task: category- or feature-specific effects? Cogn. Brain Res.
- Electrophysiology reveals semantic memory use in language comprehension. Trends Cogn. Sci.
- Fast oscillatory dynamics during language comprehension: unification versus maintenance and prediction? Brain Lang.
- How long-term memory and accentuation interact during spoken language comprehension. Neuropsychologia
- EEG theta and alpha responses reveal qualitative differences in processing taxonomic versus thematic semantic relationships. Brain Lang.
- Semantic memory and the brain: structure and processes. Curr. Opin. Neurobiol.
- Gamma- and theta-band synchronization during semantic priming reflect local and long-range lexical–semantic networks. Brain Lang.
- Theta power as a marker for cognitive interference. Clin. Neurophysiol.
- Retrieval of visual, auditory, and abstract semantics. NeuroImage
- Event-related EEG/MEG synchronization and desynchronization: basic principles. Clin. Neurophysiol.
- Crossmodal binding through neural coherence: implications for multisensory processing. Trends Neurosci.
- Brain oscillations during semantic evaluation of speech. Brain Cogn.
- Alpha and theta brain oscillations index dissociable processes in spoken word recognition. NeuroImage
- Oscillatory brain dynamics associated with the automatic processing of emotion in words. Brain Lang.
- Early decreases in alpha and gamma band power distinguish linguistic from visual information during spoken sentence comprehension. Brain Res.
- Distributed memory, modular subsystems and dysphasia.
- Changes in theta and beta oscillations as signatures of novel word consolidation. J. Cogn. Neurosci.
- Oscillatory brain activity in the alpha range is modulated by the content of word-prompted mental imagery. Psychophysiology
- fMRI responses to video and point-light displays of moving humans and manipulable objects. J. Cogn. Neurosci.
- Interacting memory systems: does EEG alpha activity respond to semantic long-term memory access in a working memory task? Biology (Basel)