Abstract
Shared attention experiments examine potential differences in function or behavior when stimuli are experienced alone versus in the presence of others, and when participants simultaneously attend to the same stimulus or set. Previous work has found enhanced reactions to emotional stimuli in social situations, yet these changes might reflect enhanced communicative or motivational purposes. This study examines whether viewing emotional stimuli in the presence of another person influences attention to, or memory for, the stimulus. Participants passively viewed emotionally-valenced stimuli while completing another task (counting flowers). Each participant performed this task both alone and in a shared attention condition (simultaneously with another person in the same room) while EEG signals were measured. Recognition of the emotional pictures was measured afterwards. A significant shared attention behavioral effect was found in the attention task but not in the recognition task. Compared to event-related potential responses to neutral pictures, we found a larger P3b response to task-relevant stimuli (flowers) and larger Late Positive Potential (LPP) responses to emotional stimuli. However, no main effect of shared attention was found between presence conditions. Shared attention may therefore have a more limited effect on cognitive processes than previously suggested.
Introduction
Humans are social animals that often prefer acting together, rather than alone1,2. Potential differences in function or behavior when stimuli are experienced alone or in the presence of others are referred to as co-presence effects, which can be generated either from the mere presence of another or from a shared experience that occurs in the physical or psychological presence of another person3,4,5. When simultaneous attention of the participants to the same stimulus or set is involved, shared experience overlaps with the term shared attention4,6. Indeed, both shared experience and shared attention have been found to affect cognitive function and behavior, such as attention to target stimuli3,7, social learning8, memory9,10, motivation11, judgment12 and inhibitory control13. Specifically, shared attention has been found to amplify these processes (e.g., memories, emotions, and behavioral learning)4,6. For example, a higher manifestation of social learning was found in the shared attention condition compared to an unshared condition8.
Several5,14,15 studies have examined the effect of shared attention in emotional contexts, but yielded mixed results. For example, Shteynberg and colleagues examined online shared attention effects on reactions to emotional stimuli. In a series of studies, they presented either short clips or pictures to participants and found enhancement of the subjective feelings (self-reported) for emotional stimuli in the online presence of another person. Specifically, scary advertisements led to higher “scariness” scores in the shared attention condition compared to the same stimuli viewed alone. Similarly, in response to positively and negatively-valenced pictures (the International Affective Picture System; IAPS16), self-reported feelings were magnified in the direction of the attended object’s valence in the shared attention condition. Such effects were not found for neutrally-valenced images12. Similarly, in an fMRI study by Wagner et al.5, participants believed that in some of the trials a friend was also viewing the same stimuli (IAPS pictures). The authors found increased fMRI activation in the brain reward system and in prefrontal areas such as the dorsolateral prefrontal cortex (DLPFC) for the shared viewing condition. The authors suggested that shared emotional experience is rewarded by brain activation that strengthens a motivational tendency to bond and affiliate with others in emotional situations5.
Along the same lines, Boothby, Clark, & Bargh examined the effect of shared attention in a simple chocolate tasting test, by having participants taste the chocolate either simultaneously with another participant (“shared” condition) or alone yet in the presence of another participant (“unshared” condition). Their results show that shared attention magnifies the perceived pleasantness or unpleasantness of the chocolate’s taste. These results led the authors to suggest the Emotional Amplification Theory, according to which experiences are amplified when shared, an effect attributed to the co-attended presentation of the stimuli in the shared condition6.
However, other studies examining the effect of shared experience have not been consistent with this theoretical concept17,18. For example, Fridlund18 found that positively-valenced stimuli elicited smiling whose frequency varied monotonically with the perceived sociality of the viewing condition (shared experience, audience, or alone); however, this effect was not related to the subjective emotion reported by the participants. Another study, which measured facial expressions and subjective emotions of participants while they watched a sad movie alone or in a shared condition, found changes in the frequency of sad expressions between presence conditions; however, there was no relation between facial expressions and the reported subjective feelings17.
Recently, Jolly et al.14 suggested that such behavioral changes represent enhanced communicative or motivational behavior rather than an elevated subjective experience in co-presence situations. In a series of eight experiments, they failed to find an effect of shared attention on participants’ emotional experience while they watched emotional video clips, although participants seemed to value shared experiences and were motivated to engage in them. This result remained constant among various experimental setups, which manipulated participants’ physical co-presence or shared experience14.
Nonetheless, it seems that shared attention can alter memory for stimuli19. For example, several studies examined the effect of shared attention on memory, and showed higher accuracy rates as well as shorter response times in recognizing stimuli presented in the shared condition10,20. Furthermore, several studies showed that performing a task with a joint setup (having stimuli assigned to each participant from a different category or type) enhances attention to stimuli from the partner’s category more than other task-irrelevant stimuli, in addition to stimuli of one’s own category7,9,21. This effect scaled with the psychological distance between participants – there was a larger shared-attention effect for participants who saw stimuli on the same screen than for those who saw them separately20.
These mixed results call for a better understanding of the effect of shared attention on one’s cognition and behavior. Moreover, while previous studies investigated the effects of shared attention on the subjective emotional reaction, none of these studies addressed the potential effect of shared attention on cognitive processing, and specifically on attention and memory for emotional stimuli. Furthermore, examining shared attention in a physical co-presence setup, using an objective measure rather than subjective self-report, may provide a more accurate estimation of the shared attention effect.
To this end, the current study examined the effect of shared attention on attention to and memory for emotional stimuli (IAPS pictures;16), in a set-up using both behavioral measures and a dual-electroencephalogram (EEG) recording. Participants were first asked to count rare stimuli (pictures of flowers) that were interspersed among emotionally-valenced IAPS pictures with either positive, negative or neutral valence. The use of the IAPS pictures enables one to control for both the valence and the arousal of stimuli and counterbalance them between experimental sets16,22,23. Each participant performed this task twice: alone and simultaneously with another person in the room (shared condition), while EEG was recorded.
It has been previously suggested that shared attention may alter the underlying cognitive resources4,5. Although behavioral outcomes alone can be informative in examining the question of whether shared attention enhances attention and memory to emotionally valenced stimuli, it is limited to the participant’s explicit report (or sometimes reaction time measures). In the current study, EEG was used to examine the potential implicit effect of shared attention on attention to emotional stimuli. EEG allows for precise temporal resolution that can reveal differences between early and late processing of emotional stimuli24, which can complement behavioral measures.
For this purpose, we analyzed two Event-Related Potential (ERP) components time-locked to the presentation of stimuli during the flower-counting task: the Late Positive Potential (LPP) and the P3b. The LPP is a positive deflection in EEG amplitude elicited by visual stimulus perception. The LPP typically arises over parietal sites, occurs 400–800 ms post-stimulus presentation, and is enhanced for emotionally salient relative to neutral stimuli23,25,26,27. The P3b, a positive deflection with maximal amplitude over centro-parietal scalp electrodes around 350–600 ms post-stimulus, is correlated with attention to task-related rare, yet anticipated, stimuli28,29,30. The P3b was calculated for flower stimuli and the LPP was calculated for emotionally-valenced stimuli, with neutral-valenced stimuli serving as a control condition for both.
In addition, mu rhythms (8–13 Hz) are established EEG components associated with social attention, recorded over centro-parietal areas31,32,33, and thus constitute a useful neural measure for the brain systems implicated in shared attention. When executing or observing actions, sensorimotor mu rhythms show a desynchronization of activity, as reflected in suppressed power relative to pre-action levels34,35,36. Moreover, greater mu suppression is found in social compared to non-social contexts, such as when one is participating in a social interaction37,38, deciphering the intentions of others from biological motion39,40, or understanding the facial expressions41 or pain levels36 of others. Note that apart from mu suppression, alpha suppression, measured in the same frequency range but over parietal-occipital areas, has been repeatedly shown to be affected by visual attention31,36,42,43,44. Both alpha and mu suppression can serve as useful measures of neural changes occurring as a result of shared attention.
We hypothesized that attention to and memory for stimuli would be magnified in the shared condition, leading to higher accuracy rates in the flower counting task (enhanced attention to the target stimuli) and to better performance in the recognition task (indirectly linked with attention to emotional stimuli during the flower counting task). In addition, we predicted heightened ERP responses, manifested as increased positivity over parietal electrodes in response to targets (P3b) and to emotional IAPS stimuli (LPP). Lastly, in line with previous studies, we expected to find stronger alpha suppression in the shared condition, indicating increased attentional processing45, as well as stronger mu suppression, indicating more social involvement, as was previously found in response to social context37,38.
Results
Flower-counting Task: Attention to Target Stimuli
Behavioral Differences between Conditions
A significant difference in flower counting accuracy was found between the two conditions (alone/shared), such that accuracy was higher when the task was performed alone [N = 40; M ± SD; alone = 94.35 ± 7.5, shared = 86.25 ± 7.9; t(39) = 5.62, p < 0.001, d = 1.04, BF10 = 10254.28] (Fig. 1a).
P3b
To examine differences in responses to target stimuli (flowers), we compared responses to target images with responses to neutral IAPS images, by conducting a 2×2 ANOVA [condition (alone, shared) × stimulus type (target (flowers), non-target (neutral IAPS))], and a comparable Bayesian ANOVA. There were no statistically significant differences between the two conditions [N = 38; M ± SD (µV); 2.1 ± 0.21, 2.38 ± 0.18 for alone and shared, respectively; F(1,37) = 2.45, p = 0.126, ηp2 = 0.06, BF10 = 0.27]. However, there was a significant main effect for stimulus type, revealing the expected larger P3b response to target compared to non-target stimuli [target = 3.44 ± 0.24, non-target = 1.05 ± 0.18; F(1,37) = 100.98, p < 0.001, ηp2 = 0.73, BF10 = 2.16 × 10²⁰]. The interaction between condition and stimulus type was not significant [F(1,37) = 3.1, p = 0.087, ηp2 = 0.07, BF10 = 0.44] (Fig. 1b,c).
A permutation analysis, conducted in order to control for the difference in the number of segments for each stimulus type, yielded similar results for condition [F(1,37) = 0.71, p = 0.40, CIF [0.18, 2.93]], stimulus type [F(1,37) = 71, p < 0.001, CIF [60.52, 88.19]] and the interaction [F(1,37) = 0.03, p = 0.85, CIF [0, 2.56]].
Flower-counting Task: Attention to Emotional Stimuli
LPP
We conducted a 2×3 ANOVA [condition × valence (neutral/negative/positive)] for the LPP amplitudes recorded during the flower counting task, as well as a comparable Bayesian ANOVA. We found a significant valence effect, reflecting increased neural LPP responses for negative (low valence) compared to positive (high valence) or neutral images [N = 38; ERP amplitude: M ± SD (µV); neutral = 0.92 ± 0.14, negative = 1.54 ± 0.2, positive = 1.05 ± 0.18; F(1.56,57.78) = 19.18, p < 0.001, ηp2 = 0.34, BF10 = 2.27 × 10⁷]. However, there was no significant effect of condition [M ± SD (µV); alone = 1.1 ± 0.17, shared = 1.23 ± 0.17; F(1,37) = 1.77, p = 0.19, ηp2 = 0.046, BF10 = 0.39]. Additionally, the interaction between condition and valence was not significant [F(1.96,73.12) = 2.44, p = 0.09, ηp2 = 0.06, BF10 = 0.2] (Fig. 2a). We also calculated LPP mean activity in a posterior-parietal electrode composition (C1, C2, CP1, CP2, Cz, CPz/Pz) and ran a control analysis in frontal sites (Fpz, Fp1, Fp2). In both cases we obtained similar results to the first analysis (see Supplementary Information).
Alpha and Mu Rhythms
As alpha band power tends to differ significantly between participants36,46, we first normalized the data using a logarithmic scale (ln(mean activity)). We analyzed mu activity (8–13 Hz) over central areas (electrodes C3 and C4) as a measure of social attention31 (Fig. 3a). There was no significant effect of condition on mu activity, confirmed via both ANOVA and Bayesian ANOVA [N = 38; alone = −0.27 ± 0.06, shared = −0.27 ± 0.06; F < 0.01, p = 0.96, ηp2 < 0.001, BF10 = 0.14]. The valence effect was also non-significant [neutral = −0.28 ± 0.06, negative = −0.275 ± 0.06, positive = −0.27 ± 0.06; F(1.93,74) = 1.22, p = 0.3, ηp2 = 0.03, BF10 = 0.05], as was the condition × valence interaction [F(1.68,62.33) = 2.446, p = 0.1, ηp2 = 0.06, BF10 = 0.125].
Next, we analyzed alpha power (8–13 Hz) over occipital cortex (electrodes O1, O2 and Oz) as an additional measure of visual attention42 (Fig. 3c). There was no significant effect of condition on alpha rhythm, confirmed via both ANOVA and Bayesian ANOVA [N = 38; alone = 0.11 ± 0.04, shared = 0.135 ± 0.04; F(1,37) = 0.76, p = 0.39, ηp2 = 0.02, BF10 = 0.64]. The valence effect was also non-significant [neutral = 0.125 ± 0.04, negative = 0.124 ± 0.04, positive = 0.124 ± 0.04; F(1.77,65.7) = 0.003, p = 0.99, ηp2 < 0.001, BF10 = 0.045]. Although there was a significant condition × valence interaction [F(1.65,61.3) = 4.66, p = 0.018, ηp2 = 0.11], it was supported neither by the Bayesian analysis [BF10 = 0.2] nor by simple effects between conditions (p > 0.05). As topographies hinted at a left laterality effect, we also computed a laterality index (C3−C4), and found no effect of shared attention on mu suppression activity [alone = −0.286 ± 0.11, shared = −0.253 ± 0.12; F(1,38) = 0.15, p = 0.69, ηp2 < 0.01] or of valence [F(1.47,56.06) = 0.12, p = 0.88, ηp2 < 0.01] in this measure either.
Recognition Task: Memory for Emotional Stimuli
All subjects performed the task better than chance (i.e. had more than 55% success across conditions). To test whether the shared attention manipulation (alone/shared) affected recognition of emotional pictures, we calculated d’ as a measure of recognition accuracy (see Methods). We found no main effect for condition [N = 42; F(1,41) = 0.00, p = 1, ηp2 < 0.001, BF10 = 0.13] or for valence [neutral = 1.63 ± 0.87, negative = 1.6 ± 0.07, positive = 1.48 ± 0.09; F(1.94) = 2.47, p = 0.09, ηp2 = 0.05, BF10 = 0.61], nor a significant interaction between condition and valence [F(1.78,73) = 0.23, p = 0.77, ηp2 < 0.001, BF10 = 0.08] (see Fig. 2d).
Discussion
In the current study, we examined the effect of shared attention on attention to and recognition of emotional stimuli. Behaviorally, we found higher accuracy rates for the task when performed alone, compared to when it was performed in the presence of another person. At the neural level, we found no significant differences between conditions in either the P3b responses or alpha / mu band activity. We also found no differences in LPP responses to IAPS pictures. Lastly, we found no difference in recognition of emotional pictures as a function of shared attention.
Considering attention to the target stimuli first, the only solid shared attention effect found in our study was that of higher accuracy rates in the flower counting task when performed alone compared to when it was performed simultaneously with another person. This finding, although counter-intuitive, is supported by previous work suggesting that the shared condition may increase arousal47 or distract participants48, thus may impair performance compared to when the task is performed alone. Note that this effect was weakly reflected in the P3b neural response, showing the expected higher amplitude for target stimuli, and higher yet non-significant response in the shared condition, supporting increased arousal during the task.
The ERPs showed the expected heightened LPP responses to negative stimuli compared to neutral or positive ones25,26,49. However, shared attention had no effect on neural measures or on memory for emotional stimuli (as measured behaviorally). Indeed, previous work has yielded mixed results regarding the effects of emotional stimuli when manipulating the presence of others. While in some studies participants reported enhanced feelings in the shared condition6,12, other studies found that shared experience tinted subjective feelings more positively regardless of the stimulus valence (positive/negative)5, reported an effect on behavioral measures (e.g., facial expressions) with no concurrent effect in self-reported emotions17,18, or did not find such an effect at all14. Our results echo the latter findings and suggest that shared attention may have a more limited effect than previously suggested.
A potential account for the lack of shared attention effects may lie in the nature of the task itself. While we measured behavioral and neural differences in each valence condition, participants were not explicitly asked about their subjective feelings in response to the stimuli. Thus, we cannot rule out a potential shared attention effect on subjective emotional feelings, such as those reported by Wagner and colleagues5 and by Shteynberg et al.12; the relation between behavior, subjective “feelings” and neuronal responses remains to be studied. An alternative explanation for the lack of an effect is our choice of regions of interest, both in terms of EEG frequencies and ERP components. We chose a hypothesis-driven approach, and specifically examined components based on prior research, using a sensor-driven approach. It is possible that a data-driven approach (e.g. component-level analysis) would yield different results.
Our study had several methodological drawbacks that may limit the generalizability of its findings. While participants were recruited randomly in time slots of two, gender was not taken into account. This led to heterogeneity of the pairs in the shared condition, as some pairs were same-gender and others were mixed. Although a recent study found no effect of gender on perception of IAPS pictures with social cues50, having mixed-gender pairs may influence participants’ responses to the stimuli51,52, may cause differences in arousal53, or may create an implied in-group/out-group effect. Since the current study is underpowered to examine differences between same-gender and mixed pairs, we encourage future studies to investigate this aspect further.
In addition, our only behavioral measure was accuracy rates, and reaction times (RTs) were not taken into account. It is possible that a shared attention effect, while missing from accuracy data, may still be revealed in RTs (or the occurrence of a speed-accuracy tradeoff).
In sum, our study provides a further examination of the role of shared attention on attention and recognition of emotional stimuli and suggests that it may be more restricted than previously described. We found no shared attention effects on memory for emotional stimuli, and no effect on relevant EEG measures of attention.
Methods
Participants
Forty-two undergraduate English-speaking students from the University of California, Berkeley participated in the study. Participants’ age range was 18–38 years (M = 22.4 years, SD = 4.5); 22 participants were female, 6 were left-handed (self-reported) and 3 reported being ambidextrous. All participants reported normal or corrected-to-normal visual acuity and had no history of psychiatric or neurological disorders, as confirmed by a screening interview. Participants received either course credit or payment for their participation and signed an informed consent form. The study was approved by the local Ethics Committee (the University of California, Berkeley, Institutional Review Board) and was conducted in accordance with the Declaration of Helsinki.
Study Design Overview
Experimental procedure
Participants enrolled in the experiment in slots of two and were instructed together. Participants completed 2 tasks: a flower counting task followed by a memory recognition task.
Flower counting task: The task was completed by each participant twice: both alone (“alone” condition) and concurrently with a partner (“shared” condition). Partners for the shared condition were assigned by study staff and did not know each other personally prior to the experiment. The order in which participants completed these two conditions was counterbalanced. During the flower counting task, EEG was recorded from participants in both the Alone and Shared conditions.
In the alone condition, one participant performed the flower counting task in the experiment room while the other subject sat in the waiting room. In the shared condition, the two participants performed the task simultaneously, sitting on opposite sides of a table while viewing stimuli on a shared screen (see Fig. 4). At the completion of each block, participants wrote down the number of their counted flowers on a small piece of paper, which was then folded, without seeing or sharing their answers with one another. In this way, each participant performed the task twice (alone–shared–wait, or wait–shared–alone), with different stimuli in each run. Each run lasted ~10 minutes and viewing parameters were kept constant for both conditions. The order of the runs, i.e. shared or alone, was counterbalanced between participants, such that if participant A did the task alone and then together with participant B, B did the task first with A, then alone.
Recognition task: in the second part of the experiment, participants were placed in separate rooms and were asked to complete a memory recognition task. Participants were not told in advance that they would be required to memorize the stimuli, yet when instructed to perform the recognition task they were informed that the other participant (their partner in the shared condition) was also performing such a task in another room. The whole procedure took about 60 minutes.
Stimuli
We created three sets of 180 stimuli each taken from 540 unique pictures of the International Affective Picture System (IAPS;16). Each set was comprised of 60 negative, 60 positive and 60 neutral pictures. The three sets were counterbalanced for their valence (M ± SD; negative = 2.47 ± 1.53; positive = 7.22 ± 1.59; neutral = 5.02 ± 1.34) and arousal (M ± SD; negative = 5.81 ± 2.19; positive = 5.01 ± 2.26; neutral = 3.48 ± 1.99), using the published IAPS norms16,22,23. Two of the sets were used for the two repetitions of the flower counting task (see below), counterbalanced between subjects, while the third set, along with half of the images from each of the first two sets, was used for the recognition task (see below).
The target stimuli (flowers) were comprised of random pictures downloaded from the internet, approved for common use. Each picture depicted one flower. Note that emotional reactivity for these photos was not assessed, thus stimuli may have elicited a slightly positive emotional response in participants. Participants were not told why they were seeing emotional pictures during the task. All pictures were 9×12 cm in size.
Behavioral Data Acquisition
Flower counting task
Each participant sat at a 45-degree angle, ~90 cm from a desktop screen (see Fig. 4). E-Prime 2.0 Professional software (Psychology Software Tools, Inc., Pittsburgh, PA) was used for stimulus presentation, using a Lenovo computer with a CRT monitor (ViewSonic P225f). On each trial, a fixation point was presented at the center of the screen for 500 ms, immediately followed by the stimulus, which was displayed for 1000 ms. Each stimulus contained either an IAPS picture (with equal probability for each stimulus type: neutral/negative/positive) or a flower picture, presented in a randomized order. The participant’s task was to silently count the number of flower pictures that appeared during the block and to write down their total number at the end of each block, a number which varied between 8 and 12. Participants completed two runs of this task (alone/shared), with each run comprised of 3 blocks of 60 IAPS pictures each and 8–12 target stimuli, lasting up to 2 minutes. Participants were given the opportunity to rest between blocks, hence a full run lasted ~10 minutes. Figure 4 illustrates the dual-EEG setup and Fig. 5 depicts the experimental design.
Recognition task
Following the flower counting task, participants sat in separate rooms for a ‘surprise’ forced-choice recognition test. In order to test the effects of shared attention on recognition, we used a subset of 90 pictures from each of the two viewing conditions (alone/shared) and added a third set of 180 novel pictures that participants had not seen before, equated for valence and arousal to the first two sets, for a total of 360 IAPS pictures. Thus, the recognition task was comprised of 180 pictures that participants had already seen during the flower counting task, and 180 novel pictures.
Participants were asked to decide, for each stimulus, whether they saw it during the flower counting task they performed earlier, by pressing one of two buttons presented on the screen (yes/no). Participants were given unlimited time to respond. As soon as their response was recorded, the next stimulus appeared on the screen.
EEG Data Acquisition
EEG recording
EEG recordings were performed during the flower counting task. EEG was recorded continuously from 64 Ag-AgCl pin-type active electrodes mounted on an elastic cap (Biosemi, http://www.biosemi.com/headcap.htm), according to the extended 10–20 system, and from two additional electrodes placed at the right and left mastoids. Data was recorded relative to CMS/DRL electrodes located between POz and PO3, while average voltage was kept in the range of ±40 mV signal using Biosemi’s electrodes offset tool. All electrodes were subsequently re-referenced digitally (see data processing below). Eye movements and blinks were monitored using bipolar horizontal and vertical Electro-oculography (EOG) derivations via two pairs of electrodes, with one pair attached to the external canthi and the other to the infraorbital and supraorbital regions of the right eye. Both EEG and EOG were digitally amplified and sampled at 1024 Hz using a Biosemi Active II system (www.biosemi.com). Triggers were sent to the Biosemi software and recorded along with the EEG data using a parallel port. When two participants were run together (Shared condition), one amplifier was ‘daisy-chained’ to the other, which then sent all information to the experimental computer. This ensured that the data received from both amplifiers was synchronized.
Behavioral Data Analysis
For the flower counting task, we calculated the overall accuracy rate for each condition (alone/ shared). One participant had zero success rate in all blocks of the flower counting task and was hence excluded from analyses. Another participant was excluded due to missing data.
Accuracy rate (%) was calculated per experimental block as 100 − (error × 100), where error is the absolute distance between the reported and the correct number of flowers, divided by the correct number of flowers (error = |reported − actual| / actual).
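The accuracy-rate formula above can be sketched as follows (an illustrative sketch, not the authors’ analysis code; the function name is hypothetical):

```python
def accuracy_rate(reported: int, actual: int) -> float:
    """Accuracy (%) = 100 - (error * 100), where error is the absolute
    miscount divided by the true number of flowers in the block."""
    return 100 - abs(reported - actual) * 100 / actual

# Example: a block contained 10 flowers and the participant reported 9.
print(accuracy_rate(9, 10))  # 90.0
```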
For the recognition task, sensitivity (d′) was calculated from the hit rate (H; correctly answering “yes”) and the false-alarm rate (FA; incorrectly answering “yes”) as the difference between their z-transformed values (d′ = z(H) − z(FA)), in order to correct for the reported bias to say that emotional stimuli appeared54,55.
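A minimal sketch of this computation, using the standard-normal inverse CDF from the Python standard library (it assumes hit and false-alarm rates have already been corrected away from 0 and 1, e.g. by a log-linear adjustment):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: d' = z(H) - z(FA)."""
    z = NormalDist().inv_cdf  # z-transform (inverse standard-normal CDF)
    return z(hit_rate) - z(fa_rate)

# Example: H = 0.8, FA = 0.2 yields d' of about 1.68.
print(round(d_prime(0.8, 0.2), 2))
```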
EEG Data Analysis
EEG data processing
The EEG data was analyzed offline using the Brain Vision Analyzer software (Brain Products; www.brainproducts.com). Data was first filtered with a high-pass filter of 0.5 Hz and with a notch filter of 60 Hz (zero-phase shift IIR filter, 4th order) and re-referenced to the common average activity from all electrodes. Individual noisy channels were selected using a semi-automated data inspection. Next, a low-pass filter at 30 Hz (zero-phase shift IIR filter, 4th order) was applied. Blinks and eye movement artifacts were identified and corrected using the Independent Component Analysis method (ICA infomax;56). Remaining EEG artifacts exceeding ±120 µV, with a voltage step of more than 50 µV/ms, activity under 0.5 µV, or with a difference (max−min) of more than 150 µV were detected, and the data during an epoch of 300 ms symmetrically encompassing the event were excluded from the analysis.
Event related potential (ERP) analysis
The continuous, artifact-corrected EEG data was segmented separately for the two experimental conditions (alone/shared) and for each stimulus type (neutral/negative/positive/flower). Epochs were extracted in a [−200, 1000] ms window around stimulus onset, and the [−150, −50] ms window was used for baseline correction. Signals were averaged for each subject for each of the 8 resultant conditions (4 stimulus types in each of the 2 experimental conditions). In each block, the number of epochs was between 49–60 for each valence condition, and between 21–32 epochs for flower stimuli.
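The epoching and baseline-correction steps can be sketched as below (an illustrative sketch under stated assumptions: continuous data held as a channels × samples NumPy array at the 1024 Hz sampling rate, with stimulus onsets given in samples; this is not the actual Brain Vision Analyzer pipeline):

```python
import numpy as np

FS = 1024  # sampling rate (Hz), as in the recording setup

def epoch(data, onsets, fs=FS):
    """Cut [-200, 1000] ms epochs around each onset and subtract the
    mean of the [-150, -50] ms pre-stimulus baseline per channel."""
    pre, post = int(0.2 * fs), int(1.0 * fs)   # samples before/after onset
    b0, b1 = int(0.05 * fs), int(0.15 * fs)    # baseline window, relative
                                               # to epoch start (-200 ms)
    epochs = np.stack([data[:, t - pre:t + post] for t in onsets])
    baseline = epochs[:, :, b0:b1].mean(axis=-1, keepdims=True)
    return epochs - baseline  # shape: (n_events, n_channels, n_samples)
```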
The Late-Positive Potential (LPP) was calculated for the neutral, negative and positive stimuli, as the mean activity in the 400–800 ms post-stimulus, in midline posterior-parietal electrodes (Pz CPz POz). This electrode combination is most frequently used to calculate LPP27,57,58,59,60,61,62. However, since some authors calculate LPP over a posterior-parietal electrode composition (C1, C2, CP1, CP2, Cz, CPz/Pz)57,59,61,62,63, we ran a second analysis using these electrode sites. We also performed a control analysis by calculating LPP mean activity in frontal sites (Fpz, Fp1, Fp2), to confirm that any observed differences between conditions were specific to the parietal LPP (see Supplementary Information).
The P3b was calculated for target stimuli (flowers) as the mean activity 350–600 ms post-stimulus, at the same posterior-parietal electrode locations (Pz, CPz, POz)13,63. We also ran a permutation analysis, on each of 10,000 iterations randomly sampling (with replacement) 21 target and 21 non-target items and recomputing the test statistics.
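Both the LPP and the P3b reduce to a mean amplitude over a time window and an electrode subset of the subject-average ERP. A minimal sketch of that computation (function and argument names are our own):

```python
import numpy as np

def mean_amplitude(erp, times, chans, picks, window):
    """Mean ERP amplitude over a time window and electrode subset, as
    used above for the LPP (400-800 ms, Pz/CPz/POz) and the P3b
    (350-600 ms, same sites). erp: (n_channels, n_times) average
    waveform; times: sample times in seconds; chans: channel names."""
    idx = [chans.index(c) for c in picks]
    mask = (times >= window[0]) & (times <= window[1])
    # Average over the selected electrodes and the selected time points.
    return erp[np.ix_(idx, np.where(mask)[0])].mean()
```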
Band power analysis
Alpha band power was computed for each participant, channel, and epoch. First, epochs were submitted to a Fast Fourier Transform (FFT; 0.5 Hz resolution, Hanning window with 10% overlap). The resulting amplitude values were squared to obtain power and averaged across frequency bins between 8 and 13 Hz. Mean alpha band power was subsequently averaged across segments in each experimental condition and log-transformed (log10), in order to normalize the data and discount general individual differences in activity. Last, we averaged across occipital electrodes O1, O2 and Oz43 for alpha activity, and across central electrodes C3 and C4 for mu activity31,37,38,45.
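The core band-power computation for a single segment and channel can be sketched as below. This illustrative version uses one Hann-windowed FFT per call (the 10% overlap and segment averaging described above are omitted), and the function name is our own.

```python
import numpy as np

def alpha_power(epoch, fs, band=(8.0, 13.0)):
    """Log10 alpha power for one epoch of one channel: Hann-windowed
    FFT, amplitude squared to obtain power, averaged over the 8-13 Hz
    bins, then log-transformed, as described above (single-segment
    sketch)."""
    w = np.hanning(len(epoch))
    amp = np.abs(np.fft.rfft(epoch * w)) / len(epoch)
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return np.log10((amp[sel] ** 2).mean())
```

With a 2 s segment sampled at 500 Hz, the FFT bins are spaced at the 0.5 Hz resolution stated above.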
Statistical analysis
Statistical analyses were first performed using the statistical software package SPSS (IBM, version 20). Differences in accuracy rates of the flower counting task between conditions were analyzed using a paired Student’s t-test. Accuracy rates of the recognition task, d’ measures, alpha band values, and LPP activity values were all analyzed using a two-way repeated-measures Analysis of Variance (ANOVA) with a Greenhouse–Geisser correction, with within-subject factors of condition (2 levels: alone and shared) and valence (3 levels: negative, neutral, positive). P3b values were analyzed using a two-way repeated-measures ANOVA with within-subject factors of condition (2 levels: alone and shared) and stimulus type (2 levels: target (flower) and non-target (neutral IAPS)). In all analyses, the Bonferroni correction was applied to correct for multiple comparisons. Results were considered significant at the level of p < 0.05.
To examine the strength of evidence for null results, we further ran a Bayesian statistical analysis using the JASP software (version 0.9.1)64, applying it to each of the previously described statistical tests. As we had no previous knowledge of the prior probabilities, we treated H1 and H0 as equally likely (prior ratio = 1) and used the Bayes factor (BF10) as our statistical measure in these tests65,66,67. Bayes factors for interactions were computed by comparing the BFs of the models with and without the interaction term68.
Since the numbers of flower and neutral stimuli differed, we further ran a permutation analysis for the P3b signal using MATLAB (MathWorks, version R2018b). To create comparable sample sizes in each ERP calculation (the number of segments was 21–32 for flower events and 53–60 for neutral IAPS events), we randomly sampled 21 segments with replacement from each stimulus type and repeated the procedure 10,000 times. The reported results correspond to the modal F value across iterations, given the degrees of freedom of the test (alpha level = 0.05; 95% CI).
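The resampling scheme above was run in MATLAB; its structure can be sketched in Python as follows. For brevity, a simple mean difference stands in for the full repeated-measures F statistic, and the function name and seed are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def resampled_stats(target, nontarget, n_iter=10_000, n=21):
    """Equalize trial counts by drawing n epochs with replacement from
    each stimulus type and recomputing the statistic on each of n_iter
    iterations, as described above. Here the statistic is a mean
    difference (a stand-in for the ANOVA F used in the paper)."""
    diffs = np.empty(n_iter)
    for i in range(n_iter):
        t = rng.choice(target, size=n, replace=True)
        nt = rng.choice(nontarget, size=n, replace=True)
        diffs[i] = t.mean() - nt.mean()
    return diffs
```

The distribution of the statistic across iterations can then be summarized (e.g., by its mode) and evaluated against the test's degrees of freedom.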
Data availability
The datasets generated during the current study are available from the corresponding author on reasonable request.
References
Lakin, J. L., Jefferis, V. E., Cheng, C. M. & Chartrand, T. L. The Chameleon Effect as Social Glue: Evidence for the Evolutionary Significance of Nonconscious Mimicry. Journal of Nonverbal Behavior 27, 145–162 (2003).
Neuberg, L. S., Kenrick, T. D. & Schaller, M. Evolutionary Social Psychology. in Handbook of social psychology. (eds. Fiske, S. T., Gilbert, D. & Lindzey, G.) 1–32 (John Wiley & Sons, 2009).
Böckler, A., Knoblich, G. & Sebanz, N. Effects of a Coactor’s Focus of Attention on Task Performance. J. Exp. Psychol. Hum. Percept. Perform. 38, 1404–1415 (2012).
Shteynberg, G. Shared Attention. Perspect. Psychol. Sci. 10, 579–590 (2015).
Wagner, U. et al. Beautiful Friendship: Social Sharing of Emotions Improves Subjective Feelings and Activates the Neural Reward Circuitry. Soc. Cogn. Affect. Neurosci. 10, 801–808 (2014).
Boothby, E. J., Clark, M. S. & Bargh, J. A. Shared Experiences Are Amplified. Psychol. Sci. 25, 2209–2216 (2014).
He, X., Lever, A. G. & Humphreys, G. W. Interpersonal Memory-Based Guidance of Attention is Reduced for Ingroup Members. Exp. Brain Res. 211, 429–438 (2011).
Shteynberg, G. & Apfelbaum, E. P. The Power of Shared Experience: Simultaneous Observation With Similar Others Facilitates Social Learning. Soc. Psychol. Personal. Sci. 4, 738–744 (2013).
Eskenazi, T., Doerrfeld, A., Logan, G. D., Knoblich, G. & Sebanz, N. Your Words are My Words: Effects of Acting Together on Encoding. Q. J. Exp. Psychol. 66, 1026–1034 (2013).
Shteynberg, G. A Silent Emergence of Culture: The Social Tuning Effect. J. Pers. Soc. Psychol. 99, 683–689 (2010).
Carr, P. B. & Walton, G. M. Cues of Working Together Fuel Intrinsic Motivation. J. Exp. Soc. Psychol. 53, 169–184 (2014).
Shteynberg, G. et al. Feeling More Together: Group Attention Intensifies Emotion. Emotion 14, 1102–1114 (2014).
Peterburs, J., Liepelt, R., Voegler, R., Ocklenburg, S. & Straube, T. It’s not Me, it’s You - Differential Neural Processing of Social and Non-social NoGo Cues in Joint Action. Soc. Neurosci. 14, 1–11 (2017).
Jolly, E., Tamir, D., Burum, B. & Mitchell, J. P. Wanting Without Enjoying: The Social Value of Sharing Experiences. https://doi.org/10.17605/OSF.IO/B3ZJU. (2018)
Raghunathan, R. & Corfman, K. Is Happiness Shared Doubled and Sadness Shared Halved? Social Influence on Enjoyment of Hedonic Experiences. J. Mark. Res. 43, 386–394 (2006).
Lang, P. J., Bradley, M. M. & Cuthbert, B. N. International Affective Picture System (IAPS): Technical Manual and Affective Ratings. NIMH Cent. Study Emot. Atten. 39–58 (1997). https://doi.org/10.1027/0269-8803/a000147
Jakobs, E., Manstead, A. S. R. & Fischer, A. H. Social Context Effects on Facial Activity in a Negative Emotional Setting. Emotion 1, 51–69 (2001).
Fridlund, A. J. Sociality of Solitary Smiling: Potentiation by an Implicit Audience. J. Pers. Soc. Psychol. 60, 229–240 (1991).
Echterhoff, G. & Kopietz, R. The socially shared nature of memory: From joint encoding to communication. Collab. Rememb. Theor. Res. Appl. 113–134 (2018).
Wagner, U., Giesen, A., Knausenberger, J. & Echterhoff, G. The Joint Action Effect on Memory as a Social Phenomenon: The Role of Cued Attention and Psychological Distance. Front. Psychol. 8, 1697 (2017).
Elekes, F., Bródy, G., Halász, E. & Király, I. Enhanced encoding of the co-actor’s target stimuli during a shared non-motor task. Q. J. Exp. Psychol. 69, 2376–2389 (2016).
Mikels, J. A. et al. Emotional category data on images from the International Affective Picture System. Behav Res Methods 37, 626–630 (2005).
Olofsson, J. K., Nordin, S., Sequeira, H. & Polich, J. Affective Picture Processing: An Integrative Review of Erp Findings. Biological Psychology 77, 247–265 (2008).
Codispoti, M., Ferrari, V. & Bradley, M. M. Repetition and Event-related Potentials: Distinguishing Early and Late Processes in Affective Picture Perception. J. Cogn. Neurosci. 19, 577–586 (2007).
Lavoie, M. E. & O’connor, K. P. Effect of Emotional Valence on Episodic Memory Stages as Indexed by Event-Related Potentials. World J. Neurosci. 3, 250–262 (2013).
Schupp, H. T., Junghöfer, M., Weike, A. I. & Hamm, A. O. Emotional Facilitation of Sensory Processing in the Visual Cortex. Psychol. Sci. 14, (2003).
Choi, D. et al. Effect of Empathy Trait on Attention to Various Facial Expressions: Evidence from N170 and Late Positive Potential (LPP). J. Physiol. Anthropol. 33, 1–9 (2014).
Luck, S. J. An Introduction to the Event-Related Potential Technique. (MIT Press, 2005).
Delplanque, S., Lavoie, M. E., Hot, P., Silvert, L. & Sequeira, H. Modulation of Cognitive Processing by Emotional Valence Studied Through Event-Related Potentials in Humans. Neurosci. Lett. 356, 1–4 (2004).
Polich, J. Updating P300: An Integrative Theory of P3a and P3b. Clinical Neurophysiology 118, 2128–2148 (2007).
Dumas, G., Nadel, J., Soussignan, R., Martinerie, J. & Garnero, L. Inter-Brain Synchronization during Social Interaction. PLoS One 5, e12166 (2010).
Pineda, J. A. The functional significance of mu rhythms: Translating “seeing” and “hearing” into “doing”. Brain Res. Rev. 50, 57–68 (2005).
Fox, N. A. et al. Assessing human mirror activity with EEG mu rhythm: A meta-analysis. Psychol. Bull. 142, 291–313 (2016).
Gonzalez-Liencres, C., Shamay-Tsoory, S. G. & Brüne, M. Towards a neuroscience of empathy: Ontogeny, phylogeny, brain mechanisms, context and psychopathology. Neurosci. Biobehav. Rev. 37, 1537–1548 (2013).
Oberman, L. M. et al. EEG evidence for mirror neuron dysfunction in autism spectrum disorders. Cogn. Brain Res. 24, 190–198 (2005).
Perry, A., Bentin, S., Bartal, I. B. A., Lamm, C. & Decety, J. ‘Feeling’ the Pain of Those Who Are Different from Us: Modulation of EEG in the mu/alpha Range. Cogn. Affect. Behav. Neurosci. 10, 493–504 (2010).
Yin, J., Ding, X., Xu, H., Zhang, F. & Shen, M. Social Coordination Information in Dynamic Chase Modulates EEG Mu Rhythm. Sci. Rep. 7, 4782 (2017).
Naeem, M., Prasad, G., Watson, D. R. & Kelso, J. A. S. Electrophysiological signatures of intentional social coordination in the 10–12 Hz range. Neuroimage 59, 1795–1803 (2011).
Karakale, O., Moore, M. R. & Kirk, I. J. Mental Simulation of Facial Expressions: Mu Suppression to the Viewing of Dynamic Neutral Face Videos. Front. Hum. Neurosci. 13, 34 (2019).
Perry, A., Troje, N. F. & Bentin, S. Exploring Motor System Contributions to the Perception of Social Information: Evidence from EEG Activity in the mu/alpha Frequency Range. Soc. Neurosci. https://doi.org/10.1080/17470910903395767 (2010).
Ensenberg, N. S., Perry, A. & Aviezer, H. Are you looking at me? Mu suppression modulation by facial expression direction. Cogn. Affect. Behav. Neurosci. 17, 174–184 (2017).
Sauseng, P. et al. A Shift of Visual Spatial Attention Is Selectively Associated with Human EEG Alpha Activity. Eur. J. Neurosci. 22, 2917–2926 (2005).
Woodruff, C. C., Daut, R., Brower, M. & Bragg, A. Electroencephalographic α-band and β-band Correlates of Perspective-Taking and Personal Distress. Neuroreport 22, 744–748 (2011).
Klimesch, W. Alpha-band Oscillations, Attention, and Controlled Access to Stored Information. Trends in Cognitive Sciences 16, 606–617 (2012).
Tognoli, E., Lagarde, J., DeGuzman, G. C. & Kelso, J. A. S. The phi complex as a neuromarker of human social coordination. Proc. Natl. Acad. Sci. 104, 8190–8195 (2007).
Baumgartner, T., Esslen, M. & Jäncke, L. From Emotion Perception to Emotion Experience: Emotions Evoked by Pictures and Classical Music. Int. J. Psychophysiol. 60, 34–43 (2006).
Zajonc, R. B. Social Facilitation. Sci. New Ser. 149, 269–274 (1965).
Baron, R. S. Distraction-Conflict Theory: Progress and Problems. Adv. Exp. Soc. Psychol. 19, 1–40 (1986).
Schupp, H. T., Schmälzle, R., Flaisch, T., Weike, A. I. & Hamm, A. O. Reprint of “Affective Picture Processing as a Function of Preceding Picture Valence: An ERP Analysis”. Biol. Psychol. 92, 520–525 (2013).
diFilipo, D. & Grose-Fifer, J. An Event-Related Potential Study of Social Information Processing in Adolescents. PLoS One 11, e0154459 (2016).
Kret, M. E. & De Gelder, B. A Review on Sex Differences in Processing Emotional Signals. Neuropsychologia 50, 1211–1221 (2012).
Han, S., Fan, Y. & Mao, L. Gender Difference in Empathy for Pain: An Electrophysiological Investigation. Brain Res. 1196, 85–93 (2008).
Bianchin, M. & Angrilli, A. Gender Differences in Emotional Responses: A Psychophysiological Study. Physiol. Behav. 105, 925–932 (2012).
Kensinger, E. A. & Corkin, S. Memory Enhancement for Emotional Words: Are Emotional Words More Vividly Remembered Than Neutral Words? Mem. Cogn. 31, 1169–1180 (2003).
Dougal, S. & Rotello, C. M. ‘Remembering’ Emotional Words Is Based on Response Bias, Not Recollection. Psychon. Bull. Rev. 14, 423–429 (2007).
Jung, T. et al. Removing Electroencephalographic Artifacts by Blind Source Separation. Psychophysiology 37, 163–178 (2000).
Adamaszek, M. et al. Event-Related Potentials Indicating Impaired Emotional Attention in Cerebellar Stroke-A Case Study. Neurosci. Lett. 548, 206–211 (2013).
Benning, S. D. et al. Late Positive Potential ERP Responses to Social and Nonsocial Stimuli in Youth with Autism Spectrum Disorder. J. Autism Dev. Disord. 46, 3068–3077 (2016).
MacNamara, A. & Hajcak, G. Anxiety and Spatial Attention Moderate the Electrocortical Response to Aversive Pictures. Neuropsychologia 47, 2975–2980 (2009).
Moser, J. S., Hajcak, G., Bukay, E. & Simons, R. F. Intentional Modulation of Emotional Responding to Unpleasant Pictures: An ERP Study. Psychophysiology 43, 292–296 (2006).
Suess, F. & Abdel Rahman, R. Mental Imagery of Emotions: Electrophysiological Evidence. Neuroimage 114, 147–157 (2015).
Wu, L. et al. Empathy, Pain and Attention: Cues that Predict Pain Stimulation to the Partner and the Self Capture Visual Attention. Front. Hum. Neurosci. 11, 465 (2017).
Schupp, H. T., Flaisch, T., Stockburger, J. & Junghöfer, M. Emotion and Attention: Event-Related Brain Potential Studies. Prog. Brain Res. 156, 31–51 (2006).
JASP Team. JASP (Version 0.9)[Computer software]. (2018).
Wetzels, R. et al. Statistical Evidence in Experimental Psychology: An Empirical Comparison Using 855 t Tests. Perspect. Psychol. Sci. 6, 291–298 (2011).
Wagenmakers, E. J., Morey, R. D. & Lee, M. D. Bayesian Benefits for the Pragmatic Researcher. Curr. Dir. Psychol. Sci. 25, 169–176 (2016).
Dienes, Z. Bayesian Versus Orthodox Statistics: Which Side Are You On? Perspect. Psychol. Sci. 6, 274–290 (2011).
Mathôt, S. Bayes like a Baws: Interpreting Bayesian Repeated Measures in JASP. Available at: https://www.cogsci.nl/blog/interpreting-bayesian-repeated-measures-in-jasp (2017).
Acknowledgements
This work was supported in part by an Azrieli Fellowship from the Azrieli Foundation to A.P., and by grant R37NS21135 from the NINDS to R.T.K. We thank Melissa Y. Reyes for the illustration in Fig. 4.
Author information
Authors and Affiliations
Contributions
A.P. and R.T.K. designed the experiment, A.P. and A.S. ran the experiment, N.M., M.N. and A.P. analyzed the data. All authors discussed the results and contributed to the final manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Mairon, N., Nahum, M., Stolk, A. et al. Behavioral and EEG Measures Show no Amplifying Effects of Shared Attention on Attention or Memory. Sci Rep 10, 8458 (2020). https://doi.org/10.1038/s41598-020-65311-7