Discovering the psychological building blocks underlying climate action—a longitudinal study of real-world activism

We are in a climate emergency. Because governments are reacting too slowly, grassroots collective action is key. Understanding the psychological factors underpinning engagement can facilitate the growth of such collective action. Yet previous research in psychology has rarely provided causal evidence for which factors trigger action, has lacked focus on the climate crisis, has relied mostly on self-reported behaviour or intentions rather than objective measures, and has been mostly cross-sectional rather than longitudinal. Here we conducted a longitudinal study on the effectiveness of a 12-week video intervention designed to increase psychological predictors of collective action. The intervention boosted affective engagement, collective efficacy and self-efficacy, but did not increase observed attendance of activism events. Interviews suggested that Zoom fatigue and the online study design undercut the social interaction participants wanted in order to join events. However, a smaller in-person replication did not increase activism either. Debriefings suggested that the replication participants were primarily motivated by payment and lacked the time or resources for deeper engagement. These results highlight the crucial importance of going beyond self-reported attitudes or intentions to objectively measured activism behaviours, and show the difficulty of fostering event attendance.


Introduction
Since the industrial revolution in the early 1800s, humans have added carbon dioxide (CO2) and other greenhouse gases to the

Current study
We designed an online study to address some of these weaknesses. The aim of this study was not to test any particular theory, but rather to develop a novel methodology that clearly identifies which particular psychological factors may cause climate activism. To do this, we introduced two novel tools: (1) an intensive video intervention attempting to boost 11 psychological factors, and (2) a behaviour-tracking methodology to measure actual participation in Zoom activism events organized by two climate organizations.
Specifically, we conducted a three-month longitudinal study aimed at boosting psychological factors such as collective efficacy and social norms. Although our review of the literature revealed more than a dozen possible psychological factors, we settled on 11: those with the strongest correlational evidence of a link to activism behaviour, which also fit our intuitions, as climate activists, about what is relevant. The main goal of our study was to see which psychological factors changed in response to the intervention and to identify which of those changes predicted shifts in behaviour. The main measure of activism behaviour in this study was event participation (objectively recorded). We embedded a study team member in the UCSD Green New Deal and SD350 climate groups and verified participation in events held by both organizations. Additionally, self-reported environmental education and leadership behaviour was collected, along with other pro-environmental behaviours relevant to climate change. Finally, a semi-structured interview based on the confirmatory results provided qualitative data about activists' own experiences and perceived barriers. All anonymized data, code and materials are available at the Open Science Framework: https://osf.io/38vkz/?view_only=fbc1a81292a5404c9418c83c5fa06c93.

Study 1

Methods
Please see the Supplementary Materials for the screening questions, baseline and follow-up surveys, the semi-structured interview schedule and the 12 videos and comprehension questions.

Sample size
We calculated the required sample size based on an effect size of f2 = 0.15 for the regression analysis of hypothesis H3 (see below). This smallest effect size of interest was determined by aggregating across the literature on the 11 psychological factors. The R package pwr was used to perform a power analysis for a linear regression with alpha = 0.05, power = 0.95, effect size f2 = 0.15 and numerator d.f. = 5, which yielded N = 143. Based on our earlier experience recruiting student activists through classes and educational materials, we anticipated that 30% of participants would join at least one activism event and that half of these would develop an enduring commitment. Further, we expected about 10% of participants to drop out during the study. We therefore planned to recruit N = 160 for the main experimental group.

Outliers
Outliers were not relevant to our outcome measures as they are bounded by limited response options or limited available events.

Design
We used a single-cohort pre-post design with a pre-intervention baseline phase.

Recruitment
We recruited 170 UCSD students between 18 and 38 years of age through online department platforms and by flyers posted on the campus of UC San Diego. The inclusion criteria were that participants were enrolled at UCSD, that they believed in anthropogenic global heating and that they had no or low prior engagement in climate activism (see Screening Survey). The drop-out rate was higher than expected: 30% of participants dropped out before the study began and 13% dropped out during the three-month study period, leaving a final sample of N = 96 (22 males, 72 females and 2 unspecified). Of these, 20% were Caucasian, 51% Asian, 18% Hispanic/Latino and 11% had other ethnic/racial backgrounds; 75% identified as Democrats, 5% as Republicans and 20% as Other.

Screening survey
Participants underwent an initial screening for age (18-40) and for belief in anthropogenic global heating through two questions: 'Regardless of its cause, I am certain that climate change is actually occurring' (yes/no) and 'Human activities are a significant cause of climate change' (yes/no) (figure 1a). Participants answering no to either question were excluded. A concern in our design was that during the study some participants might attend events of climate organizations other than UCSD Green New Deal and SD350 without our knowledge. To limit this possibility, we ensured participants had little or no history of activism, since engagement in activism outside our study would have made it more likely for a participant to attend an event that we could not track. We therefore asked everyone at recruitment if they had ever attended an event organized by an environmental organization (such as a protest). Those answering yes were then asked how many times they had attended such an event (once, more than once). Fifteen participants responding 'more than once' were excluded from the study.
The remaining participants read an instruction sheet and signed a consent form covering the timeline of the study and their responsibilities over the three-month period, including rules for payment penalties for not completing tasks. Then, each week for 12 weeks, participants received two emails with links to a Qualtrics survey (one at the beginning and one at the end of the week).

Baseline period (six weeks)
For the first six weeks, the Qualtrics surveys contained a 'climate events bulletin': a list of events happening in the next 3-4 days, held by our partner climate organizations (San Diego 350 and UCSD Green New Deal) (figure 1b). At the top of the bulletin, participants were reminded that while they were required to scroll to the end of the list, they were not required to attend these events (there was no financial penalty for not attending). To ensure that participants read the bulletin, they were asked two attention-check questions about the events at the end of each bulletin. Responding incorrectly to more than one question potentially resulted in a $3 penalty for that session ($3 subtracted from the final payment of $100); however, a participant who failed this check could read the bulletin once more and re-take the test to avoid losing the $3.

The 11 psychological factors
For each factor, participants provided ratings on multiple items. Below is an example question for each of the 11 factors, with its source.
Affective engagement: 'When you think about a future impacted by climate change how strongly do you feel the following emotion? Fear…', rated from 1 (none at all) to 7 (very strongly) [12].
Collective efficacy: 'I feel confident about the capability of our society to address the climate crisis very well', rated from 1 (strongly disagree) to 7 (strongly agree) [16,28].
Perceived behavioural control: 'How much control do you have over whether you engage in climate activism?', from 1 (very little control) to 7 (a great deal of control) [17].
Social norm: 'If I engaged in environmental activism, people who are important to me would', from 1 (completely disapprove) to 7 (completely approve) [17].
Faith in institutions: 'How much do you trust the following group to do what is right in regard to the climate crisis? Governmental groups…', from 1 (never trust) to 7 (completely trust) [29].
Self-efficacy: 'I, personally, have the skills and resources to address climate change', from 1 (strongly disagree) to 7 (strongly agree) [30].
Identity: 'I see myself as a pro-environmentalist', from 1 (strongly disagree) to 7 (strongly agree) [31].
Attitudes: 'I think that engaging in climate activism is', from -3 (extremely bad) to +3 (extremely good) [17].
Intention: 'I intend to engage in climate activism during the next 6 months', from 1 (extremely unlikely) to 7 (extremely likely) [17].
Openness/Imagination: 'I would like a job that requires following a routine rather than being creative', from 1 (disagree strongly) to 7 (agree strongly) [32].
Theory of change: 'If our society starts changing to help stop the climate crisis, will these changes start from politicians/governments or from people/grassroots movements?', from 1 (definitely politicians) to 7 (definitely people).
This last factor is an ad hoc measure of participants' beliefs about how successful change would be implemented: either as a bottom-up process stemming from grassroots action and expanding to the whole of society, or as a top-down process implemented by governments and authorities. We expected that belief in bottom-up change would more strongly increase motivation to engage in action.

Demographics
Participants reported their age, gender, education, income, ethnicity and political affiliation (see electronic supplementary materials for full measures).

Video intervention
Our video intervention was designed to boost the 11 psychological factors. The intervention consisted of 12 videos, each of which addressed a mixture of two or more of the 11 psychological factors of interest. Each video was built on a theme (1. intro, 2. environmental threat, 3. human threat, 4. energy sources, 5. politics, 6. climate justice, 7. the climate movement, 8. victims and perpetrators, 9. neoliberalism and consumerism, 10. obstacles to engage, 11. how change happens, 12. imagine a climate-friendly world), and the 12 themes were selected based on the thematic curriculum of a college course on climate change taught by Dr Aron. The choice to build thematic videos, rather than building each video around one of the 11 factors, was made for two reasons: first, we hoped to reproduce the mobilizing effect of Dr Aron's class, whose thematic design has repeatedly motivated students to climate action; and second, to prevent participants from too easily recognizing in the videos the factors we were trying to boost.
We now give a flavour of how we tackled each psychological factor across the 12 videos: affective engagement: footage of the dramatic impacts of the climate crisis on people's health and living conditions (e.g. communities devastated by floods); collective efficacy: footage of collective climate action such as the Sunrise movement campaigns that achieved a shift in political priorities; social norms: interviews with professors and peer-student activists talking about their experiences, bonds and mentorship within their activist community; faith in institutions: the history of unproductive international and national climate policy initiatives, combined with examples of current politicians who have an earnest climate crisis focus; self-identity: scenes that would engage, for example, self-identified naturalists and social justice advocates; self-efficacy and attitudes: testimonies of other students devoting themselves to making change; theory of change: interviews of activists and academics analysing the history of social movements and the feasibility of local climate activism in driving policy change; and imagination: hypothetical scenarios of a future world following a transition away from fossil fuels. The two remaining factors, perceived behavioural control and intentions, were covered by footage of student activists addressing the feasibility of climate activism as yet another extra-curricular activity.
During the intervention period, participants received two Qualtrics surveys per week for six weeks. Each Qualtrics survey began with a 20-minute video that they were required to watch from beginning to end (figure 1d). They were asked to avoid distractions and to put away their devices. The video software dictated the pace of viewing (i.e. participants could not advance until the video's full duration had elapsed). After the video, participants were prompted to read a 'climate events bulletin' identical in format to the bulletins they received during the baseline period, but listing new events (figure 1d). Again, at the top of the bulletin, participants were reminded that they would not be penalized for not attending these events. They were then asked seven attention-check questions, five covering the video and two covering the event bulletin. Responding incorrectly to three or more questions resulted in a $3 penalty for that video session ($3 subtracted from the final payment of $100), and participants were redirected to watch the video again and re-take the final test to avoid losing the $3 (they had only two attempts).

Follow-up survey
After the 12-week intervention, participants completed a follow-up survey very similar to the baseline survey but without demographics (figure 1e).
Activism behaviours
We embedded a study team member in the partner organizations, who monitored how many climate activism events each participant attended. The events were held on Zoom during this period owing to the COVID-19 pandemic. She compared all sign-in names on Zoom calls with the names of our study participants and, for each match, added one 'event score' to that participant's record. This was repeated at every event, producing a record of which events every participant in our experiment attended. Participants also self-reported their environmental education and leadership behaviours and their emissions-reduction behaviours.
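The event-score bookkeeping just described can be sketched as follows. This is a minimal illustration with invented names, not the study's actual procedure code (matching was done by the embedded team member): Zoom sign-in names are normalized and compared against the participant roster, and each match adds one event to that participant's record.

```python
from collections import defaultdict

def tally_attendance(roster, events):
    """roster: list of participant names; events: one list of Zoom
    sign-in names per event. Returns events attended per participant."""
    normalized = {name.strip().lower(): name for name in roster}
    scores = defaultdict(int)
    for signins in events:
        seen = set()  # count each participant at most once per event
        for signin in signins:
            key = signin.strip().lower()
            if key in normalized and key not in seen:
                scores[normalized[key]] += 1
                seen.add(key)
    return dict(scores)

# Invented example names:
counts = tally_attendance(
    roster=["Ana Lopez", "Ben Tran"],
    events=[["ana lopez", "Guest123"], ["Ben Tran", "Ana Lopez"]],
)
# counts -> {"Ana Lopez": 2, "Ben Tran": 1}
```

In practice Zoom display names need not match enrolment names, which is one reason the monitoring was done by a person rather than automatically.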

Semi-structured interview
Following the confirmatory analyses, we conducted semi-structured interviews within two months of the end of the study (figure 1f) [34,35]. These were initially planned to gain a more nuanced understanding of how participants experienced the psychological factors that emerged as the best predictors of activism behaviour. Because activism behaviour was largely absent, the interviews were instead used to understand what held participants back from engaging in the online events. The semi-structured interviews also explored participants' subjective beliefs about what caused any increase in the psychological factors from pre- to post-intervention. With these general aims in mind, and drawing on each participant's quantitative results, we tailored an interview script. Possible questions could relate to changes or lack of change from baseline to follow-up, as well as the lack of engagement in climate action. An example script is shown in the electronic supplementary materials.

Exclusion criteria
The exclusion criteria were answering incorrectly more than 10 attention-check questions during the baseline period (two bulletins per week, two questions per bulletin, for a total of 24 questions) or more than 24 attention-check questions during the intervention period (two videos of five questions each plus two bulletins of two questions each per week, for six weeks, for a total of 84 questions). No participant was excluded for this reason.

Hypotheses
H1: One or more of the 11 psychological factors related to climate activism would increase from baseline to follow-up.
We expected that factor 3 (faith in institutions) might go up or down after the intervention. For example, some participants may have concluded from the material we showed that policy elites are unreliable, given the general failure of UN and governmental policy over 30 years. Others might have felt more faith in institutions when they saw how the Sunrise movement boosted congressional action. We therefore maintained neutral expectations for how this factor may change and how this variation could affect participation.
H2: Each behavioural outcome would increase from baseline to follow-up: (a) climate activism events attended (objective), (b) environmental education and leadership behaviours (self-report), and (c) emissions-reduction behaviours (self-report).
H3: For each behavioural outcome, any change from baseline would be explained by changes in one or more of the psychological factors from the beginning to the end of the study.

Predictors
We calculated mean composites from the items within each of the 11 psychological factors. However, we note that factor 11 (theory of change) was novel, with Cronbach's α = 0.56 at both time points. To reach the minimum reliability threshold of α = 0.6, one item was dropped, raising α to 0.64. All other composites had α > 0.6 (see electronic supplementary materials for all alphas).

Outcomes
Objective participation in activism was computed for each participant as the mean number of events attended (from the events bulletin) during the baseline period and during the intervention period. The self-reported environmental education and leadership behaviours and emissions-reduction behaviours were computed separately for each timepoint as mean composites.
H1: We ran 11 paired-sample (pre-post) t-tests, one for each psychological factor composite, with Bonferroni-Holm correction for multiple comparisons.
H2: For each of the three behavioural outcomes, we ran a paired-samples t-test between baseline and follow-up, with Bonferroni-Holm correction for multiple comparisons.
H3: Given the lack of change in the behavioural outcomes from pre- to post-intervention, this analysis was dropped.
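The testing procedure for H1 and H2, paired t-tests followed by a Bonferroni-Holm step-down adjustment, can be sketched as follows. This is an illustrative Python version with invented data; the study's own analysis code is available on the OSF page.

```python
import numpy as np
from scipy import stats

def holm_adjust(pvals):
    """Bonferroni-Holm step-down adjustment of raw p-values."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)           # ascending p-values
    m = len(p)
    adjusted = np.empty(m)
    running_max = 0.0               # enforce monotonicity of adjusted ps
    for rank, idx in enumerate(order):
        running_max = max(running_max, (m - rank) * p[idx])
        adjusted[idx] = min(1.0, running_max)
    return adjusted

# Invented pre/post composites for two factors (rows = participants):
pre = np.array([[3.0, 4.0], [3.5, 4.2], [2.8, 3.9], [3.2, 4.1]])
post = np.array([[4.1, 4.0], [4.4, 4.3], [3.9, 3.8], [4.0, 4.2]])
raw = [stats.ttest_rel(post[:, j], pre[:, j]).pvalue for j in range(2)]
adj = holm_adjust(raw)
```

Holm's procedure controls the family-wise error rate at the same level as Bonferroni but is uniformly more powerful, which matters with 11 simultaneous factor tests.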

Baseline psychological factors
Our pre-registered plan was to test whether the baseline psychological factors predicted the outcome behaviours. However, given the lack of behavioural outcome change from pre- to post-intervention, this analysis was also dropped.

Semi-structured interview
We interviewed those participants who expressed interest in this part of the study (n = 40). Interviews were conducted via Zoom video call and lasted one hour. Audio of the interviews was recorded and transcribed for analysis, identifiable information was removed, and the recordings were then destroyed. Transcripts were managed and coded in NVivo software. Following transcription, participants were contacted with a copy of the interview transcript for review and approval. We then conducted a content analysis of all transcripts to identify common themes, adhering to the guidelines of content analysis in [36]. Two coders started by familiarizing themselves with the data and collaborating over the generation of initial content codes. Following the establishment of the codebook, they each coded the same two transcripts, discussed discrepancies and revised the codebook as necessary. The remaining transcripts were split between the two coders. Upon completion of the coding process, they worked collaboratively to identify, review and define the themes that arose.

Post-hoc follow-up survey
Five months after the end of the study, a follow-up survey was sent to all the participants, asking them to report whether they had participated in any event organized by a climate organization after the end of the study. If they answered yes, they were asked to list the name(s) of the organization(s) and the number of events attended. This follow-up survey was designed and delivered after stage 1 approval.

Results
We started by testing hypothesis H1 that one or more of the 11 psychological factors related to climate activism would increase from baseline to follow-up. Three psychological factors significantly increased from pre- to post-intervention (after applying Bonferroni-Holm correction): affective engagement, collective efficacy and self-efficacy.
For H2, our prediction was that each behavioural outcome would increase from baseline to follow-up: (a) climate activism events attended (objective), (b) environmental education and leadership behaviours (self-report) and (c) emissions-reduction behaviours (self-report). However, there was scarcely any change in behavioural outcomes. For objective attendance of activism events, only two participants joined during the intervention (one joined five events, the other joined one event); self-reported environmental education and leadership showed no change (−0.02 out of 7), and self-reported emissions reduction likewise showed no change (+0.01 out of 7; all t(95) < 1.00, all ps > 0.99, all ds < 0.10).

Final interviews
Forty participants agreed to be interviewed at the end of the study. Based on the quantitative results above, the final interview script was structured to address two main questions: (1) what prevented participants from attending the online activism events? and (2) what led to the change in the psychological factors that (a) increased on average across participants (affective engagement, self-efficacy and collective efficacy) and (b) increased for an individual participant (i.e. at least +2 out of 7 on a factor)? Finally, participants were asked their opinion on ways to better design this study to achieve greater activist engagement.

Interviewee responses on the lack of behavioural change
Interviewees were told that most of the participants in the study did not attend the activism events advertised during the study period. They were asked why they thought most people did not attend and why they themselves did not feel compelled to attend. As personal reasons were usually merged with speculations about other people's reasons, both answers are compiled together in table 2. The most common obstacles were (1) Zoom fatigue owing to the COVID-19 pandemic (during this study, students had been attending classes on Zoom for over a year); (2) a perceived lack of opportunities to socialize normally, given that the climate events were only on Zoom; (3) school workload; and (4) being too busy with other responsibilities such as a job or community organizing.

Pre-post psychological factor changes
The participants were asked why they thought affective engagement, collective efficacy and self-efficacy changed at the group level. A description of these results can be found in the electronic supplementary material (table S1), along with a description of why they thought some psychological factors increased for them personally (electronic supplementary material, table S2).

Post-hoc and exploratory analyses

Follow-up survey
After five months, only three of the 40 participants reported having attended between one and five activism events organized by the UCSD Green New Deal, and one of them said they attended three events at the unmonitored organization Grove. Based on the attendance records of UCSD Green New Deal, we confirmed that one participant became a sustained member of the organization. It was not possible to verify attendance for the other two participants; however, not all of the organization's events over those five months required sign-up sheets, so it is unknown whether they participated in one of the unmonitored events. Regardless, event attendance was nearly zero after the study.

Risk perception
Affective engagement had two subcategories: risk perception (questions such as 'How likely do you think it is, from 1 to 7, that a) worldwide, many people's standard of living will decrease, b) worldwide water shortages will occur….') and emotional response (questions such as 'When you think about a future impacted by climate change how much, from 1 to 7, of the following emotions do you feel? Guilt, Sadness, Fear, Shame…'). In the main analysis, risk perception and emotional response were collapsed into one composite score for affective engagement. In this post-hoc phase the items were analysed separately, which revealed that while risk perception changed from pre- to post-intervention (p < 0.001, d = 0.43), the emotional response did not (p = 1, d = 0.1). Additionally, the pre-post change for risk perception was greater than the pre-post change for emotional response (t(95) = 3.00, p = 0.003, d = 0.3).
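The post-hoc contrast, testing whether risk perception changed more than emotional response, reduces to a paired t-test on the difference between the two change scores. The following is an illustrative Python sketch with invented data, not the study's analysis code; the paired Cohen's d is computed from the per-person difference of changes.

```python
import numpy as np
from scipy import stats

def compare_changes(risk_pre, risk_post, emo_pre, emo_post):
    """Paired t-test on (risk_post - risk_pre) - (emo_post - emo_pre),
    plus a paired Cohen's d for that difference of change scores."""
    d_risk = np.asarray(risk_post, float) - np.asarray(risk_pre, float)
    d_emo = np.asarray(emo_post, float) - np.asarray(emo_pre, float)
    result = stats.ttest_rel(d_risk, d_emo)
    diff = d_risk - d_emo
    cohens_d = diff.mean() / diff.std(ddof=1)
    return result.statistic, result.pvalue, cohens_d

# Invented data: risk perception rises about 1 point, emotion is flat.
t, p, d = compare_changes(
    risk_pre=[3.0, 3.0, 3.0, 3.0], risk_post=[4.0, 4.1, 3.9, 4.0],
    emo_pre=[3.0, 3.0, 3.0, 3.0], emo_post=[3.0, 3.0, 3.0, 3.0],
)
```

Testing the difference of changes directly, rather than comparing the two separate significance results, avoids the fallacy that 'significant versus non-significant' implies a significant difference.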

Study 1 discussion
This study tested whether a longitudinal video intervention would trigger non-activists to join online climate action events, and which psychological factors might account for such behavioural change. After the six-week video intervention, three psychological factors increased: affective engagement, collective efficacy and self-efficacy. Exit interviews of 40 participants suggested that the main trigger of these changes was the video content, in particular the information about extreme weather events, health impacts, and collective and individual activism. Contrary to our expectations, only two participants joined the activist events (all of which were online given the COVID-19 pandemic). After five months, one participant had become a sustained activist in a local organization. The interviews suggested that poor participation was overwhelmingly due to Zoom fatigue and the participants' perception that the Zoom format precluded typical social interactions. This opened the question of whether the intervention was weak because of this lack of social interaction or because of the absence of change in the other eight psychological factors. Therefore, after the Registered Report Stage 1 In-Principle Acceptance and with the editor's permission, we ran an exploratory follow-up study with an in-person intervention in which participants watched the videos in a social setting and could attend on-campus activism events.

Unregistered pilot study 2
This was an in-person, six-week replication of Study 1 that did not include a baseline period. The aim was to test whether the in-person format triggered any activist participation.

Methods
We recruited 66 participants assuming a dropout rate similar to or greater than Study 1 (about 30% or more, given the more effortful in-person tasks), aiming for a final sample of N = 40. This small sample (smaller than the N prescribed by our power analysis for Study 1) seemed appropriate for a small pilot study with no planned comparison between two groups (experimental and baseline). Thirty-eight participants completed the full study (a 42% dropout rate). Recruitment was carried out as in Study 1, but this time the final reward was $150 per person to facilitate recruitment under pandemic conditions. As in Study 1, participants first underwent a screening survey to exclude those not believing in anthropogenic global heating and those who had attended more than one climate activism event prior to the study (figure 1). Those passing the screening signed a consent form agreeing to complete online surveys and to attend a 1.5 h in-person event every week for six weeks. Everyone first completed a baseline survey measuring the same 11 psychological factors and self-reported climate-related behaviours as Study 1. Then, every Monday night, participants gathered in a classroom at UC San Diego. Here, a study member coordinated an ice-breaking activity: solving riddles in groups of three or four. She then showed two 25-min videos (the same videos from Study 1), after which the participants took a four-question comprehension quiz on their mobile phones. They were prompted to respond to the last two questions of the quiz collectively, in groups of three to four, to engage each other in conversation about the video content. After this, an activist guest from the climate organization UCSD Green New Deal came into the room and described a bulletin of her organization's events for the following five days. A comprehension quiz followed, with participants again responding to the last two questions in groups.
If participants did not show up at the Monday session or did not complete any of the comprehension quizzes, they incurred a $20 penalty. At the activism events, a confederate tracked attendance by having all attendees fill in a sign-up sheet. At the end of the six weeks, participants again completed the survey measuring the 11 psychological factors and self-reported behaviours. There was a group debrief activity during the last in-person session.
We hypothesized that the in-person intervention would increase some of the 11 psychological factors and attendance of activist events.
Results
Only two participants (5%) engaged in objectively verified activist events (they attended one event each). During the final group debrief, participants most often reported that they did not participate because of lack of time, scheduling conflicts and lack of payment. Self-reported activist behaviour did not increase, neither for environmental education and leadership (+0.09 out of 7) nor for emissions-reduction behaviours (+0.21 out of 7; all t(37) < 1.60, all ps > 0.10, all ds < 0.30).

Discussion
Study 2 was an additional, in-person intervention study where videos were watched in a social context, event bulletins were presented by a real activist (rather than emailed in a written list), and activism events were held in person on campus rather than on Zoom. However, only two participants (5%) each joined a single event. This small study suggests that including a social component, in-person study events, and activist events is not sufficient to trigger attendance at activist events, at least with this study design in this population and during the late-2021 pandemic.

General discussion
Three psychological factors were boosted by the Study 1 intervention (self-efficacy, collective efficacy and affective engagement), and only two of 96 participants attended activist events (2%). The main reasons they gave for not attending were Zoom fatigue and a perceived lack of social interaction at online events. The in-person intervention in Study 2 increased two factors, self-efficacy (as in Study 1) and identity, and only two of 38 participants attended an activist event (5%).

Change in psychological factors
Across both studies, the video intervention boosted self-efficacy, such that participants felt empowered and more aware that their skills could contribute to climate activism. The intervention also boosted affective engagement and collective efficacy (Study 1) and identity (Study 2). Only four of the 11 factors changed in either study, suggesting that changes of this size in these four factors alone were not sufficient to trigger action. Re-designing the videos might help to boost all 11 factors: future research could employ a professional videographer, create content that is more local to the participants and use forms of narrative structure, perhaps by involving actors. Alternatively, additional factors might need to be discovered and targeted. It remains possible that even large increases in the 11 factors would not be sufficient to trigger action when participants lack the time or financial resources to volunteer for activism.

Recruitment incentives
When advertising the studies (with a reward of $100 for Study 1 and $150 for Study 2) we hid the climate crisis content to attract a neutral (and generalizable) student audience rather than a self-selecting audience already interested in climate action. For Study 2 the payment was increased to $150 because few students were signing up, probably due to pandemic conditions. The informal debriefing for Study 2 strongly indicated that participants were short of money and time and expected payment to join the activist events.
In our assessment, the central reason why our two studies were not effective in driving more people to participate in climate activism was that participants were taking part mostly for payment. This reflects the socioeconomic status and material realities of a segment of the UC San Diego undergraduate population, many of whom both attend school and work jobs. Students did appear to care about the climate, but they appeared not to have the time and resources to enter these particular activist spaces. This observation leads to the fundamental question of how to overcome the near-universal barriers of time and financial resources to engage in activism.
One potential incentive to join action for those lacking time or financial resources could be the support system that is often created inside an activist group, something our study did not encourage. Our personal observations point to emotional and social benefits of participation. When activists share values and struggles inside the activist space, they become empathetic with each other and can learn solidarity skills of mutual aid and support to overcome engagement obstacles. For instance, they can cover each other's work shifts to allow one to go to a rally, or help each other find jobs within activism or advocacy. These connections with other activists also lead to valuable professional networks. Accordingly, interventions might trigger more engagement by advertising these benefits or by designing the intervention to foster these exchanges. We also recommend providing more event times that fit busy schedules, and specifically recruiting participants already interested or engaged in climate activism.

Conclusion
The intensive video intervention increased several psychological factors associated with collective action on climate, but only a few participants attended activist events. The main weakness in this design may have been the reliance on paid study participation: we appear to have inadvertently selected some of the financially challenged students in our community who had little time and resources for volunteer activities. The study design could be strengthened in multiple ways from better video-making to creating different participation opportunities.
The central question of this study remains highly relevant: how can people be moved to collective action to protect our climate during a climate collapse? This study sheds light on how difficult it is to trigger this behaviour, which is all the more reason for psychological studies to use longitudinal designs and objective measurements of key behaviours.
Ethics. Our procedures were approved by the University of California, San Diego, Human Research Protections Programme, with project no. IRB #201545.