The Psychological Impacts and Message Features of Health Misinformation
A Systematic Review of Randomized Controlled Trials
Abstract
What does health misinformation look like, and what is its impact? We conducted a systematic review of 45 articles containing 64 randomized controlled trials (RCTs; N = 37,552) on the impact of health misinformation on behaviors and their psychological antecedents. We applied a planetary health perspective by framing environmental issues as human health issues and focusing on misinformation about diseases, vaccination, medication, nutrition, tobacco consumption, and climate change. We found that in 49% of the cases exposure to health misinformation damaged the psychological antecedents of behaviors such as knowledge, attitudes, or behavioral intentions. No RCTs evaluated the impact of exposure to misinformation on direct measures of health or pro-environmental behaviors (e.g., vaccination), and few studies explored the impact of misinformation on feelings, social norms, and trust. Most misinformation was based on logical fallacies, conspiracy theories, or fake experts. RCTs evaluating the impact of impossible expectations and cherry-picking are scarce. Most research focused on healthy adult US populations and used online samples. Future RCTs can build on our analysis and address the knowledge gaps we identified.
In recent years, health authorities have become increasingly concerned about the impact of misinformation on health behaviors. For example, the World Health Organization (WHO) has called for a global movement to mitigate the harm caused by health misinformation (WHO, 2020). The concerns and calls to action seem justified, as systematic reviews show that health misinformation on issues such as vaccination, pandemics, and smoking is widespread on social media (Suarez-Lledo & Alvarez-Galvez, 2021; Wang et al., 2019). Moreover, results from correlational survey studies show that exposure to health misinformation is associated with an increase in unhealthy behaviors or psychological antecedents (Luk et al., 2021; Singh et al., 2022).
However, the mere presence of health misinformation on social media is not an indicator of its impact, and correlational studies provide limited evidence on whether unhealthy behaviors or their antecedents are causally due to misinformation. Thus, one of the goals of this systematic review was to synthesize evidence from randomized controlled trials (RCTs) that evaluated the impact of exposure to health misinformation on human behaviors or their psychological antecedents.
Health Misinformation
Information and misinformation are often distinguished based on truth or falsehood. Hence, the term misinformation is commonly used to describe false information (Cacciatore, 2021; Wang et al., 2019). In applied research on misinformation, messages are usually labeled “false” if they explicitly contradict scientific consensuses, such as the causal link between tobacco smoking and lung cancer, or the effectiveness of vaccinations (Scheufele & Krause, 2019). Yet, defining misinformation based on truth value can be problematic for reviewing research on the impact of health misinformation because it can lead to the exclusion of content that is deeply misleading, if not entirely false, and that is highly impactful for global health. For example, in the 1950s, the tobacco industry used advertisement messages such as “More doctors smoke Camels than any other cigarette” to encourage cigarette consumption (SRITA, 2021). Some of these messages may have been true, but they were misleading insofar as the intent was to frame smoking as part of a healthy lifestyle – implicitly undermining the emerging scientific consensus that smoking causes cancer (Cummings & Proctor, 2014). In fact, implicatures – messages that omit relevant information – are devices that can be used to encourage people to adopt inaccurate beliefs with negative health implications, without necessarily spreading false information per se (Søe, 2021). Thus, we use misleadingness versus non-misleadingness rather than falsehood versus truth to distinguish misinformation from information.
Health misinformation is a subset of misinformation related to health topics. This includes the domain of environmental health that deals with threats resulting from exposure to chemical, biological, or physical factors, such as elevated temperatures and low air quality (Crimmins et al., 2016). These factors, in turn, can cause or worsen health conditions and can also influence the risk of epidemics and pandemics around the globe (Myers, 2017). Acknowledging that human health is situated within environmental systems that are in turn influenced by human action has given rise to a new perspective on health – that is, planetary health (Horton & Lo, 2015). Planetary health is defined as “the health of human civilization and the state of the natural systems on which it depends” (Whitmee et al., 2015, p. 1978). Applying the planetary health perspective, we framed environmental issues as human health issues to provide a more comprehensive view of the psychological impact of misinformation related to health. This follows recent calls for increased collaboration between environmental and health psychology to better address the interdisciplinary nature of health in the Anthropocene era (Inauen et al., 2021). Specifically, we focused on the impact of misinformation on individuals’ behaviors and their antecedents regarding infectious and non-infectious diseases, vaccination, medication, nutrition, tobacco consumption, and environmental issues such as climate change.
Psychological Antecedents of Behavior
Exposure to health misinformation may change health behaviors by influencing beliefs, feelings, and motivations that are relevant to behaviors (Ecker et al., 2022). These relevant beliefs, feelings, and motivations can be grouped into psychological factors, so-called psychological antecedents, that are commonly used to explain and predict behaviors (Ajzen, 1991; Kobbeltvedt & Wolff, 2009). By including psychological antecedents as a potential outcome variable in this review we aim to better understand the potential indirect impact of exposure to health misinformation on health behaviors.
The theory of planned behavior (TPB; Ajzen, 1991) is often used to describe psychological antecedents of health and pro-environmental behaviors because of its predictive power in both domains (Chao, 2012; Hamilton et al., 2020). According to the TPB, behavioral intention (i.e., the perceived likelihood of engaging in behavior) is the strongest predictor of actual behavior. The primary antecedents of intention are an individual’s attitude (i.e., favorable or unfavorable beliefs about a behavior), perceived norm (i.e., belief in what significant others do and motivation to comply), and perceived behavioral control (i.e., belief in the ease or difficulty to perform a behavior) (Ajzen, 1991). Further, feelings and emotions toward a behavior, trust, and knowledge are common extensions of the TPB (Canova et al., 2020; Kobbeltvedt & Wolff, 2009; Shi et al., 2021) that are also mentioned as relevant outcome variables in research on the impact of misinformation (Featherstone & Zhang, 2020; Lee et al., 2020; Pummerer et al., 2022). In this review, we used the constructs of an extended TPB as a framework to analyze the impact of health and environmental misinformation on human behaviors and their psychological antecedents.
Message Features of Misinformation
Despite the usefulness of TPB in structuring the psychological antecedents of behavior, the theory offers little insight into how to counter misinformation that affects these antecedents. Sophisticated debunking of misinformation typically contains explanations of why the misinformation is misleading (Ecker et al., 2022), and research reveals that individuals can be inoculated against the impact of misinformation when they are equipped with counterarguments that uncover common deceptive tactics (Cook et al., 2017; Roozenbeek et al., 2023). This systematic review allows us to identify the message features of health misinformation, which can support future debunking and inoculation interventions to counter misinformation by uncovering common argumentative tactics. Moreover, the analysis of message features allows us to map potential knowledge gaps in understanding tactics of health misinformation and derive directions for further research in this area.
Several frameworks exist to structure the intrinsic message features of misinformation, but many are limited to specific domains, such as vaccination (Kata, 2010). A framework that can be applied to misinformation about health and environmental topics, the so-called FLICC framework, is based on five common rhetorical tactics of science denialism: fake experts, logical fallacies and misrepresentations, impossible expectations, cherry picking, and conspiracy theories (Cook, 2019; Diethelm & McKee, 2008). Thus, the second goal of this review was to analyze message features of health misinformation using the FLICC framework.
Overview
We systematically reviewed the psychological impact and message features of health misinformation. Specifically, we aimed to:
- (1) Synthesize evidence from RCTs that evaluated the impact of exposure to health misinformation on human behaviors or their psychological antecedents.
- (2) Synthesize message features of health misinformation from RCTs that evaluated the impact of exposure to health misinformation.
Methods
A systematic review was conducted to identify RCTs that compare the impact of exposure to health misinformation (i.e., experimental condition) with exposure to either no information or unrelated information (i.e., control condition). RCTs that only compare exposure to misinformation with exposure to interventions (e.g., debunking or factual information) were not included because the effects of those RCTs may result from either the impact of misinformation or the impact of the intervention. We applied a planetary health perspective to the issue of health misinformation; that is, we framed environmental issues as human health issues and included research on environmental misinformation in this systematic review. The data selection and analysis of the systematic review were inspired by the PRISMA approach (Page et al., 2021). The PRISMA flow diagram is reported in the Electronic Supplementary Materials, ESM 1, Figure 1.
Search and Selection Procedure
The literature search was conducted using the following databases: Medline via PubMed, PsycInfo, and Scopus. The generic search terms and the unique search strings for each database are provided in ESM 1 (Tables 1 and 2). The search terms reflected the broader concept of misinformation, including terms related to misleadingness (e.g., misleading information). Furthermore, terms that limit the search to randomized controlled trials (e.g., experiments) and to traditional health issues and environmental issues (e.g., vaccination or climate change) were added to the search strings. If applicable, a database filter for the type of publication (i.e., peer-reviewed journal articles) was used during the initial search. No limitations were applied to publication dates.
The database search was conducted on May 23, 2022. First, duplicates were identified using the Mendeley deduplication tool. Duplicates were removed only after manual verification. Two raters (P. Schmid and S. Altay) independently scanned the resulting articles by title and abstract and excluded articles that matched the a priori exclusion criteria (ESM 1, Table 3). Disagreements in ratings resulted in the inclusion of the article for the full-text search. The resulting 141 articles were then analyzed using a full-text search. Again, the a priori exclusion criteria were applied independently by both raters, and disagreements were resolved via discussion. The initial coding sheets of both raters are provided online (Schmid et al., 2022).
Information on the year of publication, sample characteristics (location, size, type, method of recruitment), outcome variables, misinformation stimuli used in the RCTs, and statistical significance of the impact of misinformation on outcome variables were extracted from the final 45 articles. The extracted information is provided in ESM 1, Table 4. All outcome measures were coded using TPB constructs to describe the impact of misinformation on the psychological antecedents of behavior. When the impact of misinformation on a TPB construct was statistically significant, it was coded as a misinformation effect. When the impact of misinformation on a TPB construct was only statistically significant for a subsample, it was coded as a conditional effect. Impacts of misinformation that were statistically significant only for some measures of a TPB construct but not for other measures of the same construct were coded as mixed results. When misinformation caused a statistically significant effect in an unexpected direction (e.g., an increase in knowledge), then its impact was coded as a reversed effect. Moreover, all misinformation stimuli were coded using FLICC to describe the intrinsic message features of misinformation. The procedure of coding outcomes using TPB and coding misinformation stimuli using FLICC is reported in ESM 1, Methods, and Tables 5–8. A summary of the coded information is provided in ESM 1, Table 9.
Results
Descriptive Analysis of Articles
The 45 identified articles cover 41 years of RCTs on the impact of misinformation, with the oldest article published in 1981 and the peak of publications in 2021 (12/45). In total, the articles reported 64 RCTs (N = 37,552) on the impact of misinformation. In 15 RCTs, participants in the control group received irrelevant messages (e.g., information about baseball); in all other RCTs, participants in the control group received no messages (ESM 1, Table 10). The majority of RCTs focused on the general adult population (48/64). Some RCTs focused on adults with specific conditions, such as being unvaccinated (3/64), being a current or former smoker (4/64), or being a consumer of pain medication (1/64), whereas other RCTs sampled specific groups: students (4/64) and health care workers (2/64). Only a few RCTs reported results on the impact of misinformation on children and adolescents (2/64). The largest proportion of research used online samples (60/64) and was conducted in the WHO Region of the Americas (39/64; 95% in the United States), followed by the European Region (18/64), the African Region (7/64), and the Western Pacific Region (1/64).
Several RCTs evaluated multiple different types of misinformation. In total, the 64 RCTs included 86 evaluations of the impact of exposure to misinformation (vs. a neutral control group) on psychological variables. Eleven evaluations tested misleading rather than outright false information; these cases remained in the final analyses, given our definition of health misinformation. Four evaluations did not provide enough information about the stimuli investigated. The final set of analyzed evaluations was thus n = 82, covering various health domains (Table 1). In the following sections, all reported effects of misinformation are relative to a neutral control group that received either no information or unrelated information (ESM 1, Table 10).
Analysis of the Impact of Misinformation
Behavior
The impact of health or environmental misinformation on actual or self-reported behavior was included as an outcome measure in only two evaluations. Specifically, a study among US adults revealed that participants signed a pro-environmental petition less often when exposed to climate change misinformation (van der Linden, 2015). In the second RCT, exposure to vaccination misinformation did not influence the writing of a comment as a response to the misinformation among US adults (Dixon, 2020).
Intention
Behavioral intentions were measured as outcomes in 31 evaluations (Table 1). Intentions were measured in about 44% of evaluations regarding misinformation about conventional health topics, but only in about 16% of evaluations regarding misinformation about climate change. Seven evaluations revealed that exposure to misinformation caused lower intentions to perform healthy behaviors. For example, exposure to vaccination misinformation decreased the intention to get a fictitious vaccine for a fictitious child among US adults (Jolley & Douglas, 2014) and decreased the intention to get an HPV vaccine among young Chinese adults (Chen et al., 2021). One evaluation found a conditional effect; that is, the impact of misinformation on the intention to smoke was detectable across all measures only for current smokers, not for former smokers (Gratale et al., 2018).
Fifteen evaluations reported no significant effect of misinformation on behavioral intention on any measure, and six evaluations reported mixed results. Another three evaluations reported reversed effects on behavioral intention. For example, an RCT among healthcare personnel in Germany revealed that exposure to anti-vaccination misinformation increased the likelihood of recommending some COVID-19 vaccines to patients (Priebe et al., 2022).
Attitude
Attitudes were measured in 40 evaluations covering various health domains (Table 1). Exposure to misinformation caused a decrease in positive attitudes toward an object (e.g., drug) or action (e.g., policy support) in 12 evaluations. For example, exposure to vaccination misinformation decreased positive attitudes toward measles, mumps, and rubella (MMR) vaccination among US adults (Featherstone & Zhang, 2020), and exposure to misinformation on nutrition increased positive evaluations of food products among Swiss adult participants and Danish students (Clement et al., 2017; Sütterlin & Siegrist, 2015). Three of these effects were conditional. For example, exposure to anti-vaccination Trump tweets decreased concerns about vaccination among Trump voters but not among non-Trump voters (Hornsey et al., 2020).
Eleven evaluations revealed no significant effect of misinformation on any attitude measure, and 15 reported mixed results. Lastly, two evaluations reported a reversed effect. For example, a misleading nutrition label on food products (i.e., 30% less fat) caused a decrease in positive evaluations of the product (Bialkova et al., 2016).
Perceived Norms
Scales related to norms were used as outcomes in eight evaluations. Norms were measured in about 26% of evaluations regarding climate change and only in about 5% of evaluations regarding conventional health topics (Table 1). Four evaluations found that misinformation had a negative effect on individuals’ perceived norms. For example, exposure to climate change misinformation decreased the perceived scientific consensus among adult samples in the US (Cook et al., 2017; Drummond et al., 2020; van der Linden et al., 2017).
A second experiment by Cook et al. (2017) in the US and a replication of the study in Germany (Schmid-Petri & Bürger, 2022) did not find a significant effect of climate change misinformation on the perceived scientific consensus of adult participants. Further, one evaluation of vaccination misinformation also found no significant impact, and one evaluation reported mixed results.
Perceived Behavioral Control
Perceived behavioral control was measured in a single evaluation. A study among young adults in China found no effect of exposure to HPV vaccination misinformation on participants’ perceived behavioral control (Chen et al., 2021).
Feelings and Emotions
The impact of misinformation on feelings and emotions was assessed in four evaluations (Table 1). One evaluation on the issue of vaccination reported a negative effect of exposure to misinformation across different measures of emotions: exposure to vaccination conspiracy theories increased feelings of fear and anger in one RCT among US adults (Featherstone & Zhang, 2020). Three evaluations reported mixed results. For example, in one study, exposure to vaccination conspiracy theories increased feelings of disillusion but not feelings of powerlessness among US adults (Jolley & Douglas, 2014).
Trust
Trust-related outcome variables were measured in eight evaluations (Table 1). Two evaluations reported significant effects, with one of them being a conditional effect. Specifically, exposure to misinformation about COVID-19 decreased trust in institutions among a sample of German students (Pummerer et al., 2022) and exposure to vaccination misinformation decreased confidence in news organizations and credibility of journalists for US adults with low prior belief in vaccine conspiracies (Dixon, 2020). The effect on confidence in news organizations was reversed for US adults with high prior belief in vaccine conspiracies; that is, trust was increased when the media shared misinformation that was in line with the prior beliefs of a vaccine-hesitant audience (Dixon, 2020).
Neither another study on vaccine misinformation nor any of the tests focusing on climate change revealed significant impacts of misinformation on measures of trust.
Knowledge
Knowledge-related measures such as accuracy judgments and belief in false information and facts were the most prevalent variables in evaluations across all outcome measures (Table 1). Exposure to misinformation caused a decrease in accuracy judgments of facts and an increase in belief in misinformation in 23 out of 53 evaluations. For example, exposure to misinformation increased the belief in misinformation surrounding depression treatments, sunscreen, and causes of cancer among US adults (Natoli & Marques, 2021; Porter & Wood, 2022; Vraga et al., 2022). Four of these evaluations were conditional. For example, one study found that exposure to misleading TV advertisements about food products increased misconceptions about the products’ fruit content among children when they were instructed to pay close attention and watched nothing but the advertisements; there was no evidence of this effect when the instruction was absent and the advertisements were embedded in a real TV program (Ross et al., 1981).
Twenty evaluations of exposure to misinformation did not report any statistically significant effect on knowledge, nine studies reported mixed results, and one study reported a decrease in the belief in vaccination misinformation when exposed to it.
Mixed Measures
Five evaluations contained measures that captured multiple constructs of the expanded TPB in a single score. Three evaluations found that misinformation had a negative effect on these mixed measures with two of them being conditional. One evaluation revealed no significant impact and one revealed mixed results.
Message Features of Health Misinformation
Next, following the second goal of this review, we analyzed the message features of misinformation. Several studies used the same types of misinformation across RCTs. In total, 67 unique misinformation stimuli from various health domains were tested across the 82 evaluations. For the analysis, stimuli were either taken from the article or requested from the authors by e-mail. In 10 cases, the material was either not available in English or the authors did not respond. In the following sections, we apply the FLICC framework to the remaining 57 unique misinformation stimuli (Table 1).
Logical Fallacies and Misrepresentations
Logical fallacies were identified in 41 cases and accounted for all misinformation involved in food product advertisements (Table 1). For example, Sütterlin and Siegrist (2015) showed that individuals perceived food products as healthier when the phrase “fruit sugar” was listed as an ingredient compared with the ingredient labeled “sugar.” Due to the symbolic nature of the term “fruit,” participants fell for a logical fallacy similar to the appeal to nature – the tendency to believe something is good because it is natural (Moldovan, 2018).
Cherry Picking
Selective presentation of data was identified in ten types of misinformation – most of them related to vaccination or climate change issues (Table 1). For example, Porter et al. (2019) presented participants with statements from former US President Trump that worsened the participants’ accuracy judgments of the misinformation that polar ice caps are at an all-time high. In his statements, Trump drew false conclusions about long-term averages of daily weather (i.e., climate) based on a cherry-picked sample of his personal daily experience (i.e., weather).
Fake Experts
Seventeen examples of misinformation used fake experts to distract from scientific consensus and known health risks (Table 1). For example, van der Linden et al. (2017) found that exposing individuals to a petition stating that human-caused climate change is not happening decreased individuals’ perceived scientific consensus. This so-called Oregon Global Warming Petition is said to have been signed by 31,487 scientists, whereas only around 0.5% of the signatories had a scientific background related to climate change research (Dunlap & McCright, 2010).
Impossible Expectations
Expecting the impossible from science was identified as a deceptive tactic in only two misinformation stimuli (Table 1). For example, Featherstone and Zhang (2020) exposed participants to a statement criticizing health professionals for promoting vaccinations without absolute certainty about potential side effects. Exposure decreased positive attitudes toward vaccination and increased anger among US adults.
Conspiracy Theories
Conspiracy theories that accuse public health and environmental agencies of being part of a secret plot were used as misinformation in 17 cases – with the highest proportion relating to vaccination, infectious diseases, and climate change (Table 1). For example, Lyons et al. (2019) found that exposure to a conspiracy theory claiming that the Zika virus was intentionally spread via genetically modified mosquitoes increased endorsement of this conspiracy, while Greene and Murphy (2021) found that exposure to a conspiracy theory about software developers decreased the intention to download a coronavirus contact-tracing app.
Additional Analyses
The 57 misinformation stimuli also differed in their word length (range = 1,127) and in the absolute number of different tactics present in the misinformation (range = 4). Results on whether the impact of health misinformation is a function of the type of FLICC tactic, the number of FLICC tactics, or the word length cannot be meaningfully interpreted in the context of this review because the case numbers for some tactics are too small and possible confounding factors cannot be ruled out. Correlations between impact and message features are reported in ESM 1, Table 11 for transparency.
Discussion
The first goal of this systematic review was to synthesize evidence from RCTs that evaluated the impact of exposure to health misinformation on human behaviors or their psychological antecedents using an extended TPB. Drawing on results from 64 RCTs, including 82 evaluations of misinformation, we found that only two RCTs have measured actual behavior. These RCTs, in turn, included behavioral measures of activism, such as signing petitions. No RCTs evaluated the impact of exposure to misinformation on direct measures of health or pro-environmental behaviors such as vaccination.
All RCTs evaluated the impact of misinformation on psychological antecedents of behavior – as classified by TPB. Many evaluations reported the damaging effect of exposure to misinformation on at least one psychological antecedent of behavior (40/82). That is, many studies reported that exposure to misinformation decreased the intention to perform healthy or pro-environmental behaviors, decreased positive attitudes toward these behaviors, decreased the perception that health behaviors or pro-environmental beliefs are the norms, decreased trust in advocates of science, decreased knowledge, or increased negative feelings toward public health measures.
Other studies did not find any evidence for the impact of misinformation on any of the psychological antecedents or found mixed results, or unintended backfire effects (42/82).
The lack of persuasiveness of misinformation may simply reflect the fact that the persuasiveness of any message is largely context-specific, varying across characteristics of the sender, the receiver, and the message (O’Keefe, 2002). This is supported by studies that reported conditional effects of misinformation in this review (10/40). For example, some studies found that the impact of misinformation was detectable only when its content or source was in line with individuals’ worldviews, when individuals lacked medical expertise, when individuals were current rather than former smokers, or when receivers were instructed to pay close attention (Bolsen et al., 2022; Boudewyns et al., 2021; Gratale et al., 2018; Hornsey et al., 2020; Ross et al., 1981).
In summary, the results of this review reveal that exposure to health misinformation can, at least indirectly, damage healthy and pro-environmental behaviors by influencing relevant psychological antecedents of behavior. These damaging effects, in turn, are highly context-specific, varying across characteristics of the sample, the sender, and the message content.
Message Features of Misinformation
The second goal of this review was to synthesize message features of health misinformation. We found that only very few RCTs have analyzed the impact of impossible expectations or cherry-picking on individuals’ behaviors or their psychological antecedents. Moreover, researchers conducting RCTs often use stimuli that include several different tactics at once. Thus, little is known about the uniqueness of different tactics in explaining the impact of misinformation on individuals’ behaviors or their psychological antecedents.
The results revealed that the FLICC framework sufficiently captured the generic tactics of health misinformation, including misinformation shown in RCTs to damage individuals’ psychological antecedents of health behaviors. Thus, FLICC can guide the design of interventions that aim to reduce the persuasive impact of a variety of impactful health misinformation by reducing its perceived quality and plausibility.
In line with persuasion theories and research on motivated reasoning (MacFarlane et al., 2020; Petty & Cacioppo, 1986), uncovering flaws in reasoning may be useful for audiences motivated to be accurate but less effective for individuals who are either unmotivated to process information or motivated to derive a specific conclusion. A promising approach to tackling these motivational issues is to frame health or environmental topics in a way that increases the individual’s personal relevance to these topics (MacFarlane et al., 2020). Applying the planetary health perspective in communication approaches may increase the personal relevance of a message by making either the environmental or health benefits of a specific behavior more salient, thereby tailoring messages to the specific values of the audience (Bain et al., 2016). Thus, the planetary health perspective may provide a useful framing approach to tailor interventions against misinformation to audiences that lack the motivation to be accurate.
Knowledge Gaps
The impacts of misinformation on feelings and emotions, trust, perceived norms, and perceived behavioral control were the least studied psychological antecedents of behavior in this review. The lack of RCTs on these outcomes is surprising, given that feelings and emotions, as well as misperceived norms, are known psychological drivers that are often exploited by fraudsters (MacFarlane et al., 2020), and given that emotion regulation is thought to play a central role in explaining science denial (Jylhä et al., 2022). It is also striking that norms were measured particularly rarely in studies on conventional health topics, although, for example, being informed about the actual scientific consensus on the safety of vaccination is a highly promising approach to increasing vaccine uptake among the public (Bartoš et al., 2022). Likewise, intentions were measured particularly rarely in studies on climate change, although intentions are the strongest predictors of actual behaviors (Ajzen, 1991). The existing RCTs that measure the impact of misinformation on the perceived scientific consensus of human-made climate change and the vast number of studies that measure intentions in conventional health domains could provide a good template for examining the issue of norms or intention in the respective other domain.
Furthermore, most of the RCTs on the impact of misinformation addressed the issues of vaccination, infectious diseases, or climate change. Few RCTs were found that evaluated the impact of misinformation about nutrition, non-infectious diseases (e.g., skin cancer), or medication. Moreover, no study was found on misinformation about alcohol consumption, physical exercise, or the use of antibiotics, despite their high relevance for individual and public health. There is also a lack of research exploring the impact of misinformation within the context of health behavior theories, such as the TPB. A single study (Chen et al., 2021) explicitly used the TPB as a theoretical framework to guide the selection of outcome measures for the RCT.
Limitations
RCTs typically present only small amounts of information within a very tight timeframe and, due to logistical and ethical challenges, can measure only a limited set of outcomes. The difficulty of measuring behavioral change is reflected in the small number of experiments that include actual behavioral measures. Moreover, it is very difficult to measure the potential ancillary impacts of misinformation (e.g., its influence on public discourse and political agendas) in RCTs. Effect sizes in most persuasion research are relatively small (O’Keefe, 2002), so large sample sizes are required to detect effects. Yet most evaluations in this review were based on sample sizes below N = 620, the total sample required to detect a conventionally small effect size of d = .2 with a statistical power of 80% (t-test; α = .05; allocation: 1:1; one-tailed; see G*Power, Faul et al., 2007). Some misinformation effects reported in this review are based on subsample analyses. Subsample analyses increase the risk of false-positive results, so corrections for multiple comparisons and pre-specified analyses are usually recommended (Brookes et al., 2001). However, in many cases, authors reporting significant effects from subsample analyses did not state how they handled the risk of false positives or whether the analyses were preregistered.
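The N = 620 threshold can be reproduced with a short power calculation. The sketch below uses the standard normal approximation to the two-sample t-test rather than the exact noncentral t distribution that G*Power uses, so it may differ from an exact calculation by a few participants; the function name is illustrative and not taken from any of the reviewed studies.

```python
from math import ceil
from statistics import NormalDist

def required_n_per_group(d=0.2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a one-tailed,
    two-sample t-test with 1:1 allocation (a sketch of the power
    analysis parameters stated in the text)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # one-tailed critical value
    z_power = NormalDist().inv_cdf(power)      # quantile for desired power
    # n per group = 2 * ((z_alpha + z_power) / d)^2, rounded up
    return ceil(2 * ((z_alpha + z_power) / d) ** 2)

n_per_group = required_n_per_group()
print(n_per_group, 2 * n_per_group)  # prints: 310 620
```

With d = .2, α = .05 (one-tailed), and 80% power, this yields 310 participants per group, that is, a total sample of N = 620, matching the benchmark used in the review.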
The results of this review come from a limited number of countries in the Global North, and although lab experiments usually have high internal validity, they often lack external validity. Moreover, we cannot exclude a publication bias that could lead to an under-representation of null effects. Finally, the results of a systematic review are limited by factors such as database selection and search-string design; that is, we may have missed relevant articles. Thus, this review can only be a building block for understanding the impact of health misinformation.
Conclusion
This review revealed that exposure to health misinformation can damage relevant psychological antecedents of behaviors such as knowledge, attitudes, or behavioral intentions. However, about half of the RCTs found no clear effect on these antecedents. More experimental analyses of context variables, such as characteristics of the receiver or the sender, are needed to understand when health misinformation causes harm. More conceptual and theoretical work is also needed on the causal pathways through which misinformation influences people’s beliefs and behaviors: the negative influence of misinformation may often be indirect rather than direct, and the role of potential mediators such as feelings and emotions, social norms, and trust is not well understood. Moreover, there is a lack of diversity in health misinformation research, because most RCTs focus on healthy adult US populations and are conducted online. Finally, using the FLICC framework, we classified the message features of health misinformation from 64 RCTs. The results revealed that few RCTs tested the impact of deceptive tactics such as impossible expectations and cherry-picking. This classification may help researchers who want to test the persuasiveness of different message features of health misinformation and may guide the design of future interventions aimed at detecting deceptive tactics.
Electronic Supplementary Materials
The electronic supplementary materials are available with the online version of the article at https://doi.org/10.1027/1016-9040/a000494
Philipp Schmid (PhD) is a Psychologist and a Postdoctoral Researcher at the University of Erfurt. His research focuses on health misinformation and vaccination decision-making.
Sacha Altay holds a PhD in Experimental Psychology and currently works on misinformation and (mis)trust as a Postdoctoral Research Fellow at the University of Oxford.
Laura Scherer (PhD) is an Associate Professor at the University of Colorado School of Medicine whose research focuses on risk communication and health decision-making.
References
(1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T
(2016). Co-benefits of addressing climate change can motivate action around the world. Nature Climate Change, 6(2), 154–157. https://doi.org/10.1038/nclimate2814
(2022). Communicating doctors’ consensus persistently increases COVID-19 vaccinations. Nature, 606(7914), 542–549. https://doi.org/10.1038/s41586-022-04805-y
(2016). The role of nutrition labels and advertising claims in altering consumers’ evaluation and choice. Appetite, 96, 38–46. https://doi.org/10.1016/j.appet.2015.08.030
(2022). Effects of conspiracy rhetoric on views about the consequences of climate change and support for direct carbon capture. Environmental Communication, 16(2), 209–224. https://doi.org/10.1080/17524032.2021.1991967
(2021). Experimental evidence of consumer and physician detection and rejection of misleading prescription drug website content. Research in Social & Administrative Pharmacy, 17(4), 733–743. https://doi.org/10.1016/j.sapharm.2020.06.019
(2001). Subgroup analyses in randomised controlled trials: Quantifying the risks of false-positives and false-negatives. Health Technology Assessment, 5(33), 1–56. https://doi.org/10.3310/hta5330
(2021). Misinformation and public opinion of science and health: Approaches, findings, and future directions. Proceedings of the National Academy of Sciences of the United States of America, 118(15), Article e1912437117. https://doi.org/10.1073/pnas.1912437117
(2020). Buying organic food products: The role of trust in the theory of planned behavior. Frontiers in Psychology, 11, Article 575820. https://doi.org/10.3389/fpsyg.2020.575820
(2012). Predicting people’s environmental behaviour: Theory of planned behaviour and model of responsible environmental behaviour. Environmental Education Research, 18(4), 437–461. https://doi.org/10.1080/13504622.2011.634970
(2021). Effects of vaccine-related conspiracy theories on Chinese young adults’ perceptions of the HPV vaccine: An experimental study. Health Communication, 36(11), 1343–1353. https://doi.org/10.1080/10410236.2020.1751384
(2017). Assessing information on food packages. European Journal of Marketing, 51(1), 219–237. https://doi.org/10.1108/EJM-09-2013-0509
(2019). Understanding and countering misinformation about climate change. In I. E. Chiluwa & S. A. Samoilenko (Eds.), Handbook of research on deception, fake news, and misinformation online (pp. 281–306). IGI Global. https://doi.org/10.4018/978-1-5225-8535-0.ch016
(2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS One, 12(5), Article e0175799. https://doi.org/10.1371/journal.pone.0175799
(2016). The impacts of climate change on human health in the United States: A scientific assessment. US GCR Program. https://doi.org/10.7930/J0R49NQX
(2014). The changing public image of smoking in the United States: 1964–2014. Cancer Epidemiology Biomarkers & Prevention, 23(1), 32–36. https://doi.org/10.1158/1055-9965.EPI-13-0798
(2008). Denialism: What is it and how should scientists respond? The European Journal of Public Health, 19(1), 2–4. https://doi.org/10.1093/eurpub/ckn139
(2020). Undermining credibility: The limited influence of online comments to vaccine-related news stories. Journal of Health Communication, 25(12), 943–950. https://doi.org/10.1080/10810730.2020.1865485
(2020). Limited effects of exposure to fake news about climate change. Environmental Research Communications, 2(8), Article 081003. https://doi.org/10.1088/2515-7620/abae77
(2010). Climate change denial: Sources, actors and strategies. In C. Lever-Tracey (Ed.), Routledge handbook of climate change and society (pp. 240–259). Routledge. https://doi.org/10.4324/9780203876213.ch14
(2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), 13–29. https://doi.org/10.1038/s44159-021-00006-y
(2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191. https://doi.org/10.3758/BF03193146
(2020). Feeling angry: The effects of vaccine misinformation and refutational messages on negative emotions and vaccination attitude. Journal of Health Communication, 25(9), 692–702. https://doi.org/10.1080/10810730.2020.1838671
(2018). Influence of Natural American Spirit advertising on current and former smokers’ perceptions and intentions. Tobacco Control, 27(5), 498–504. https://doi.org/10.1136/tobaccocontrol-2017-053881
(2021). Quantifying the effects of fake news on behavior: Evidence from a study of COVID-19 misinformation. Journal of Experimental Psychology: Applied, 27(4), 773–784. https://doi.org/10.1037/xap0000371
(2020). An extended theory of planned behavior for parent-for-child health behaviors: A meta-analysis. Health Psychology, 39(10), 863–878. https://doi.org/10.1037/hea0000940
(2020). Donald Trump and vaccination: The effect of political identity, conspiracist ideation and presidential tweets on vaccine hesitancy. Journal of Experimental Social Psychology, 88, Article 103947. https://doi.org/10.1016/j.jesp.2019.103947
(2015). Planetary health: A new science for exceptional action. The Lancet, 386(10007), 1921–1922. https://doi.org/10.1016/S0140-6736(15)61038-8
(2021). Environmental issues are health issues: Making a case and setting an agenda for environmental health psychology. European Psychologist, 26(3), 219–229. https://doi.org/10.1027/1016-9040/a000438
(2014). The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS One, 9(2), Article e89177. https://doi.org/10.1371/journal.pone.0089177
(2022). Science denial: A narrative review and recommendations for future research and practice. European Psychologist.
(2010). A postmodern Pandora’s box: Anti-vaccination misinformation on the Internet. Vaccine, 28(7), 1709–1716. https://doi.org/10.1016/j.vaccine.2009.12.022
(2009). The risk-as-feelings hypothesis in a theory-of-planned-behaviour perspective. Judgment and Decision Making, 4(7), 567–586.
(2020). Associations between COVID-19 misinformation exposure and belief with COVID-19 knowledge and preventive behaviors: Cross-sectional online study. Journal of Medical Internet Research, 22(11), Article e22205. https://doi.org/10.2196/22205
(2021). Exposure to health misinformation about COVID-19 and increased tobacco and alcohol use: A population-based survey in Hong Kong. Tobacco Control, 30(6), 696–699. https://doi.org/10.1136/tobaccocontrol-2020-055960
(2019). Not just asking questions: Effects of implicit and explicit conspiracy information about vaccines and genetic modification. Health Communication, 34(14), 1741–1750. https://doi.org/10.1080/10410236.2018.1530526
(2020). Protecting consumers from fraudulent health claims: A taxonomy of psychological drivers, interventions, barriers, and treatments. Social Science & Medicine, 259, Article 112790. https://doi.org/10.1016/j.socscimed.2020.112790
(2018). On appeals to nature and their use in the public controversy over genetically modified organisms. Informal Logic, 38(3), 409–437. https://doi.org/10.22329/il.v38i3.5050
(2017). Planetary health: Protecting human health on a rapidly changing planet. The Lancet, 390(10114), 2860–2868. https://doi.org/10.1016/S0140-6736(17)32846-5
(2021). The antidepressant hoax: Conspiracy theories decrease health-seeking intentions. The British Journal of Social Psychology, 60(3), 902–923. https://doi.org/10.1111/bjso.12426
(2002). Persuasion: Theory and Research. SAGE.
(2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Systematic Reviews, 10(1), Article 89. https://doi.org/10.1186/s13643-021-01626-4
(1986). The elaboration likelihood model of persuasion. In R. E. Petty & J. T. Cacioppo (Eds.), Communication and persuasion (pp. 1–24). Springer New York. https://doi.org/10.1007/978-1-4612-4964-1_1
(2022). Political misinformation and factual corrections on the Facebook news feed: Experimental evidence. The Journal of Politics, 84(3), 1812–1817. https://doi.org/10.1086/719271
(2019). Can presidential misinformation on climate change be corrected? Evidence from Internet and phone experiments. Research & Politics, 6(3), Article 2053168019864784. https://doi.org/10.1177/2053168019864784
(2022). How (not) to mobilize health workers in the fight against vaccine hesitancy: Experimental evidence from Germany’s AstraZeneca controversy. BMC Public Health, 22(1), Article 516. https://doi.org/10.1186/s12889-022-12725-9
(2022). Conspiracy theories and their societal effects during the COVID-19 pandemic. Social Psychological and Personality Science, 13(1), 49–59. https://doi.org/10.1177/19485506211000217
(2023). Countering misinformation: Evidence, knowledge gaps, and implications of current interventions. European Psychologist, 28. https://doi.org/10.1027/1016-9040/a000492
(1981). Nutritional misinformation of children: A developmental and experimental analysis of the effects of televised food commercials. Journal of Applied Developmental Psychology, 1(4), 329–347. https://doi.org/10.1016/0193-3973(81)90014-9
(2019). Science audiences, misinformation, and fake news. Proceedings of the National Academy of Sciences, 116(16), 7662–7669. https://doi.org/10.1073/pnas.1805871115
(2022). Data: The psychological impacts and message features of health misinformation. https://doi.org/10.17605/OSF.IO/4BFZD
(2022). The effect of misinformation and inoculation: Replication of an experiment on the effect of false experts in the context of climate change communication. Public Understanding of Science, 31(2), 152–167. https://doi.org/10.1177/09636625211024550
(2021). Application of the extended theory of planned behavior to understand Chinese students’ intention to improve their oral health behaviors: A cross-sectional study. BMC Public Health, 21(1), Article 2303. https://doi.org/10.1186/s12889-021-12329-9
(2022). Misinformation, believability, and vaccine acceptance over 40 countries: Takeaways from the initial phase of the COVID-19 infodemic. PLoS One, 17(2), Article e0263381. https://doi.org/10.1371/journal.pone.0263381
(2021). A unified account of information, misinformation, and disinformation. Synthese, 198(6), 5929–5949. https://doi.org/10.1007/s11229-019-02444-x
(2021). More Doctors Smoke Camels than any other cigarette. https://tobacco.stanford.edu/cigarette/img0077/
(2021). Prevalence of health misinformation on social media: Systematic review. Journal of Medical Internet Research, 23(1), Article e17187. https://doi.org/10.2196/17187
(2015). Simply adding the word “fruit” makes sugar healthier: The misleading effect of symbolic information on the perceived healthiness of food. Appetite, 95, 252–261. https://doi.org/10.1016/j.appet.2015.07.011
(2015). The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance. Personality and Individual Differences, 87, 171–173. https://doi.org/10.1016/j.paid.2015.07.045
(2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), Article 1600008. https://doi.org/10.1002/gch2.201600008
(2022). The effects of a news literacy video and real-time corrections to video misinformation related to sunscreen and skin cancer. Health Communication, 37(13), 1622–1630. https://doi.org/10.1080/10410236.2021.1910165
(2019). Systematic literature review on the spread of health-related misinformation on social media. Social Science & Medicine, 240, Article 112552. https://doi.org/10.1016/j.socscimed.2019.112552
(2015). Safeguarding human health in the Anthropocene epoch: Report of The Rockefeller Foundation – Lancet Commission on planetary health. The Lancet, 386(10007), 1973–2028. https://doi.org/10.1016/S0140-6736(15)60901-1
(2020). Call for action: Managing the infodemic. WHO. https://www.who.int/news/item/11-12-2020-call-for-action-managing-the-infodemic