A systematic review of communication interventions for countering vaccine misinformation

Background Misinformation and disinformation around vaccines have grown in recent years, exacerbated during the COVID-19 pandemic. Effective strategies for countering vaccine misinformation and disinformation are crucial for tackling vaccine hesitancy. We conducted a systematic review to identify and describe communications-based strategies used to prevent and ameliorate the effect of mis- and disinformation on people's attitudes and behaviours surrounding vaccination (objective 1) and examined their effectiveness (objective 2). Methods We searched CINAHL, Web of Science, Scopus, MEDLINE, Embase, PsycInfo and MedRxiv in March 2021. The search strategy was built around three themes: (1) communications and media; (2) misinformation; and (3) vaccines. For trials addressing objective 2, risk of bias was assessed using the Cochrane risk of bias in randomized trials tool (RoB 2). Results Of 2000 identified records, 34 eligible studies addressed objective 1, 29 of which also addressed objective 2 (25 RCTs and four before-and-after studies). Nine 'intervention approaches' were identified; most focused on the content of the intervention or message (debunking/correctional, informational, use of disease images or other 'scare tactics', use of humour, message intensity, inclusion of misinformation warnings, and communicating weight of evidence), while two focused on the delivery of the intervention or message (timing and source). Some strategies, such as scare tactics, appear to be ineffective and may increase misinformation endorsement. Communicating with certainty, rather than acknowledging uncertainty around vaccine efficacy or risks, was also found to backfire. Promising approaches include communicating the weight of evidence and scientific consensus around vaccines and related myths, using humour, and incorporating warnings about encountering misinformation. Debunking approaches, informational approaches, and communicating uncertainty had mixed results.
Conclusion This review identifies some promising communication strategies for addressing vaccine misinformation. Interventions should be further evaluated by measuring effects on vaccine uptake, rather than distal outcomes such as knowledge and attitudes, in quasi-experimental and real-life contexts.


Introduction
The COVID-19 pandemic has highlighted the ongoing public health challenge of vaccine hesitancy and vaccine refusal, and the misinformation, disinformation, and myths that feed into them. Even prior to the current pandemic, the World Health Organization (WHO) identified vaccine hesitancy - defined as the delay in acceptance or refusal of vaccines despite their availability [1] - as a top global health threat [2]. While vaccine controversies in recent years have underscored this threat, the COVID-19 pandemic has thrown a spotlight onto it.
Hesitancy is context dependent and varies greatly both within and between countries, as exemplified by COVID-19 vaccination [3]. Several models explain the varied components and antecedents of vaccine hesitancy, including the '3C' model (confidence, complacency, and convenience), which has been expanded upon by the more recent '5C' and '7C' models (confidence, complacency, constraints, calculation, collective responsibility, and conspiracy), and the '5A' taxonomy of vaccination determinants (access, affordability, awareness, acceptance, and activation) [1,4-6]. More complex and context-specific frameworks have also been developed to conceptualize the factors influencing vaccine hesitancy and uptake, such as the Adapted Royal Society of Canada Vaccine Uptake Framework and the SAGE Working Group's Vaccine Hesitancy Determinants Matrix, which underscore the complexity and multitude of factors that can contribute to vaccine hesitancy, including the communication and media environment, knowledge/awareness, and the attitudes and experiences of family and health professionals [1,7]. Misinformation intersects with many potential determinants and can undermine vaccine confidence, defined in the 3C model as "trust in (i) the effectiveness and safety of vaccines; (ii) the system that delivers them, including the reliability and competence of the health services and health professionals and (iii) the motivations of policy-makers who decide on the needed vaccines" [1].
There has been an increase in health-related mis- and disinformation in recent years, with vaccines and infectious diseases a major focus [8]. Misinformation is inaccurate information that is unintentionally presented as fact, while disinformation involves deliberately spreading false information to cause harm.
In the current era of 'fake news' and ubiquitous use of the internet and social media, increasing circulation of and exposure to mis- and disinformation has been linked to higher levels of persistent vaccine hesitancy, despite some social media platform algorithms attempting to minimize biases and the spread of mis- and disinformation [9-13].
Developing and tailoring effective health communications and campaigns to counter vaccine misinformation is essential, both for maximizing vaccine uptake as the COVID-19 pandemic continues and for ensuring high uptake of other existing and future vaccines to prevent and control disease outbreaks. Understanding the various strategies that have been tested or used to counter vaccine misinformation and disinformation, and identifying those that are potentially effective (as well as those that work less well or backfire), is crucial in order to tackle vaccine hesitancy. Previous systematic reviews [14-19] have examined interventions for addressing vaccine hesitancy and correcting misinformation on various health topics, but to our knowledge, no review has specifically examined communication-based strategies for countering vaccine misinformation. This review therefore aimed to: 1) identify and describe communication-based strategies that have been used to prevent or ameliorate the impact of vaccine misinformation and disinformation on people's attitudes and behaviors surrounding vaccination; and 2) identify strategies which appear to be effective in countering vaccine mis- and disinformation.
The study was registered with PROSPERO (CRD42021243341) and is reported in accordance with PRISMA guidance [20]. Changes from protocol are briefly noted in the text.

Search strategy
We searched CINAHL, Web of Science, Scopus, MEDLINE, Embase and PsycInfo databases using a search strategy developed in consultation with an information specialist (see Appendix A for the search strategy used in MEDLINE). Search strategies from previously published reviews on related subjects [21,22] were used as a reference to develop comprehensive lists of search terms, which were grouped around three themes: (1) communications and media; (2) misinformation, disinformation, myths, etc.; and (3) vaccines. Searches were conducted in March 2021 with no limits on publication date. To capture current research on the COVID-19 pandemic, we also searched MedRxiv, a non-commercial preprint online repository, using truncated sets of the search terms in Appendix A.

Study eligibility and selection
Citations were imported into a Rayyan online library [23]. Studies were screened for eligibility based on the inclusion and exclusion criteria described in Box 1. Title and abstract screening was conducted by two reviewers independently, with disagreements resolved by a third reviewer. Articles selected for full-text screening were exported to an EndNote library. One reviewer conducted full-text screening, with uncertainties resolved by a second reviewer.

Data availability
No data was used for the research described in the article.

Data extraction
Data were extracted by one reviewer using a pre-defined Excel database. Data extracted included publication descriptors, disease or vaccine of focus, study design, sample characteristics, comparator group information, description of the intervention, whether intervention effectiveness was assessed, and key results, including which, if any, outcomes were reported. Relevant outcomes were any author-defined measure of attitudes towards vaccines or vaccine-related information (including vaccine hesitancy), accurate knowledge of vaccines, belief in misinformation, and intention to vaccinate.

Data synthesis
For objective 1, the primary approach to data synthesis was grouping interventions by type or format (e.g., pamphlet, social media-based intervention, informational text) and content or communication strategy/approach (e.g., informational, myths vs facts approach, use of humor and logic) and summarizing intervention design by group. Groupings were developed inductively, following data extraction, and refined throughout synthesis.
For objective 2, study outcomes were iteratively grouped and ultimately classified into (1) belief in or endorsement of mis- or disinformation, (2) knowledge about vaccines and related diseases, (3) attitudes and beliefs about vaccination, including measures of vaccine hesitancy, and (4) intention to vaccinate. Data from comparative studies were summarized in tabular form, stratified by intervention strategy/approach and outcomes, to develop an effect direction plot [25]. Meta-analysis was considered inappropriate due to wide variation in both the type/format of interventions and the content or communication strategy/approach.
For brevity throughout this review, we use the term 'misinformation' as an umbrella term to refer to false information, myths, and misperceptions, as the focus of the included studies and this review is on interventions rather than assessing the source and intent of false information.

Risk of bias
The RoB 2 tool [26] was employed to assess randomized controlled trials (RCTs) (n = 25), and one cluster-randomized trial was assessed using the RoB 2 tool for cluster-randomized trials [27]. Two reviewers independently assessed each study. Disagreements were discussed and resolved accordingly. The robvis web application [28] was used to generate a summary risk-of-bias plot.

Results
After deduplication, 1800 unique results were returned from the database searches. An additional 200 articles from MedRxiv were screened, from which six further articles were identified. A total of 142 articles were selected for full-text screening (Fig. 1).
In total, 34 studies, including four preprints, were eligible for inclusion. Study characteristics are described in Table 1; the studies comprised 25 randomized controlled trials (including one quasi-randomized trial), six single-arm studies, two narrative descriptions of interventions or intervention development, and one website content analysis. Most were conducted in the United States (n = 21), followed by the United Kingdom (n = 6) and Italy (n = 2). The remaining studies were conducted in Germany, France, Canada, Israel, and Singapore (one study each). Three studies included respondents from multiple countries, and one study did not specify location [29]. All but two studies were published within the past decade. Twenty-three studies addressed specific vaccines and/or diseases (most commonly measles, mumps and rubella (MMR) and influenza), while nine studies considered vaccines generally and two utilized fictitious diseases and vaccines.
Primary outcomes of interest included vaccine-related beliefs, knowledge, and attitudes, and, specifically, intention to vaccinate (though this was often assessed as a hypothetical intention). None of the articles measured vaccine uptake or coverage.

Objective 1: Description of interventions
The interventions presented in the 34 included studies ranged widely, both in form and in messaging strategy. We outline both the format and the content and messaging approaches below, and the format of interventions is described in more detail in Table 3. Study characteristics are summarized in Table 1, with additional detail on sample characteristics in Appendix Table A1, and the distribution of intervention strategies and approaches is displayed in Table 2.

Thematic analysis of communication strategies
While the included interventions varied widely in their form, thematic similarities in approach and content emerged. These approaches or strategies emerged as intervention characteristics were extracted and inductively grouped and re-grouped. Table 2 displays the heterogeneity in the interventions' and communication strategies' form and approach. Nine 'intervention approaches' were identified; most focused on the content of the intervention or message (debunking/correctional, informational, use of disease images or other 'scare tactics', use of humor, message intensity, inclusion of misinformation warnings, and communicating weight of evidence), while two focused on the delivery of the intervention or message (message timing and message source).

Debunking, correction, and refutational approaches
Many studies utilized or assessed different types of refutational or debunking messages or approaches, in which misinformation or myths were addressed head-on and corrected. This took various forms, including pamphlets and websites that laid out common vaccine myths and corresponding corrective facts [42][43][44]; other studies tested the effects of refutational texts as compared to expository, informational texts [45] or compared responses to misinformation on social media using different types or sources of debunking or refutational messages [35,37]. A community nursing program aiming to address measles vaccine hesitancy [40] included aspects of misinformation debunking and refutation, including developing a pamphlet directly countering a disinformation pamphlet that had circulated in the community. Bode & Vraga [46] tested the use of Facebook's 'related news' function to respond to posts containing misinformation by showing related articles that either confirmed or refuted misinformation about the vaccine-autism link. Trujillo et al's study [33] assessed the impact of articles that utilized 'psychographic microtargeting' to address misinformation about the MMR vaccine.

Informational approaches
While several included interventions or communication approaches focused on refuting or correcting misinformation, another large group examined communication strategies that took a more informational or educational approach, focused on providing information on vaccines and vaccine-preventable diseases in a factual manner rather than directly addressing or debunking myths and misperceptions. This was done through both direct (i.e., listing facts) and less direct approaches (such as conveying information through a story). The informational/educational interventions, which were often informed by or aimed to address misinformation, included a fotonovela [47], pamphlets [30,48], informational government-run websites [44], videos [49,50], PowerPoint presentations for new parents [51], manipulated search engine results providing information on vaccines [52], community-based educational programs [39,40], virtual interactive dialogues that provide information addressing key vaccine concerns in an empathetic way intended to foster behavior change [41], and different types of informational messages, ranging from providing information on the risks of vaccine-preventable diseases [53][54][55] to explaining how vaccines work or are approved [56].

Use of disease images & other 'scare tactics'
Two studies tested the use of 'disease images' - photos of children experiencing severe symptoms of vaccine-preventable diseases - as an intervention, aiming to shock or scare participants into changing their attitudes [42,54]. A similar strategy, described in two papers, was employing a dramatic narrative of a child's experience with a disease or graphic written descriptions of disease symptoms [33,54].

Use of humor
Three studies examined the use of humor in messages correcting or criticizing vaccine misinformation. One [57] tested a humorous, satirical text critiquing avoidance of MMR vaccination against a non-humorous version, while two studies [36,38] examined the effect of humor-based and logic-based corrections to misinformation tweets about the HPV vaccine.

Strength and certainty of message
Three studies examined the effect of communicating vaccine information with differing levels of intensity or uncertainty. Batteux et al [34] tested hypothetical government announcements that communicated either with certainty or uncertainty about COVID vaccine effectiveness. Similarly, Kerr et al's study [56] included an experiment trialling no-caution, medium-caution, and high-caution messages about COVID vaccine safety and efficacy and the need to maintain protective behaviours after vaccination. Betsch and Sachse [58] also explored message intensity in their experiment, which assessed strongly or weakly negating either two or five typical anti-vaccination misinformation statements.

Misinformation warning
Two studies examined the effect of communications that alert or train individuals about the potential for misinformation. Tully et al [59] explored doing so on Twitter by including a tweet about spotting misinformation within a simulated Twitter feed that included misinformation tweets about influenza, while Ludolph et al's experiment [52] tested the effect of including a warning message about misinformation appearing in search results during Google search. Relatedly, educating and training parents to identify misinformation and to critically evaluate evidence was an aspect of the community nursing program described by Marcus [40].

Weight of evidence
Two studies, both considering how to address the misperception that vaccines cause autism, assessed the impact of communicating the weight of evidence surrounding this misperception. Dixon et al [31] compared articles providing a one-sided argument against the (false) autism-vaccine link, a 'false balance' article that presented arguments both for and against the link, and weight-of-evidence articles that presented both arguments but included statements underscoring the lack of both scientific evidence and consensus among scientists supporting a link between autism and vaccines. A study by van der Linden [60] similarly focused on the lack of scientific consensus supporting the supposed autism-vaccine link, using both written messages and pie charts to indicate scientific consensus that vaccines are safe.

Message source
Several studies manipulated the source of the message/intervention to identify the most effective 'messenger', rather than (or in addition to) the message content, considering factors such as the gender, credibility or trustworthiness of the message source [37,50,58,61].

Message timing
One study assessed the effect of the timing of interventions addressing misinformation. Jolley & Douglas [32] assessed whether reading an article with anti-conspiracy arguments about vaccines generally was more effective prior to ('inoculating') or following ('debunking') exposure to misinformation.

Objective 2: Identifying potentially effective strategies
To identify potentially effective communication strategies, studies were grouped by the above approaches and findings were tabulated in an effect direction plot [25] (Table 4). A summary of each approach's effect is displayed in Table 2. Five studies [40,41,44,48,51] did not assess or measure outcomes and were excluded. Intermediate outcomes, such as the perceived reliability or accuracy of messages or message sources, were not examined in this review. Results were mixed, with no intervention type showing clear and consistent positive or negative results across outcomes.
Table 3. Summary of communication interventions to address vaccine misinformation, by intervention format.

Print medium interventions
Pamphlets or booklets aimed to address misconceptions and increase knowledge about vaccines and related diseases using approaches such as question-and-answer formats [26], presenting common myths alongside established evidence countering the myth [38,39], or providing simple, comprehensible informational statements about vaccines [40]. One study [41] described the development of a fotonovela (similar to a comic book, but using photographs) designed to address myths about the HPV vaccine. Another paper described an evidence-based vaccine magazine developed in response to an anti-vaccine booklet that had circulated in the same community [36].
Video & multi-media
Two studies assessed video interventions, while one trialled a voice-over PowerPoint presentation. One video took a corrective, fact-based approach to addressing common misconceptions about vaccines [42], while the second explained how COVID-19 mRNA vaccines work [43]. The PowerPoint addressed common concerns surrounding infant vaccines and was shown to vaccine-hesitant parents of infants at their first pediatrician visit [44].
In-person interactive interventions
Studies described in-person interactive educational interventions, focused on particular vaccines and targeted at specific groups. Ho 2017 [35] assessed the impact of a series of interactive sessions run by 'Health Ambassador' volunteers with senior citizens in Singapore that aimed to improve knowledge and attitudes about influenza, pneumonia, and their vaccines. Marcus 2020 [36] described different strategies used in a grassroots community-based nursing program that aimed to counter widespread misinformation about measles and the measles vaccine among the New York Orthodox Jewish community, including developing an evidence-based magazine for parents and hosting workshops, vaccine fairs, and provider trainings.

Social media strategies
Several studies examined social media-based strategies and interventions. These were mostly trialled through experimental simulations, such as mock Facebook or Twitter feeds. Two studies tested responding to a misinformation tweet with a humor-based or logic-based correction tweet [32,34]. Similarly, two studies tested different comments responding to Facebook posts that included vaccine-related myths or misinformation. Gesser-Edelsburg et al [31] trialled a brief, unequivocal message emphasizing MMR vaccination requirements as well as a lengthier, more empathetic response that provided information responding to vaccine fears and concerns; Sullivan [33] assessed corrective messages posted as Facebook responses, but manipulated the message source, testing different sources such as libraries and the CDC. Social media-based strategies also included testing Facebook's 'related links' functionality by following a post containing misinformation with links to two related stories that confirmed and/or refuted it [45], as well as a more proactive strategy of exposing participants to a 'news literacy' tweet about how to spot fake news within a simulated Twitter feed that also included a tweet linking to a false story about the flu vaccine causing the flu [46].

Other web-based strategies & interventions
Several web-based interventions or strategies that did not involve social media were among the included studies. Ludolph et al [47] conducted an experiment in which experimenters manipulated Google search results to include an information box containing either basic or more difficult to comprehend information about vaccination, and/or a warning that 'false or misleading information' may be encountered during the search. Knight et al [37] described the development of a digital, web-based intervention - a 'chat bot' - that used interactive 'therapeutic dialogues' aiming to address COVID-19 vaccine hesitancy. An experiment by Betsch & Sachse [48] tested websites addressing vaccine adverse effects; the experiment manipulated the extremity of risk negations as well as the source of the negation by showing information on the website of a trusted government health institution or that of a pharmaceutical company.
Another study conducted a content analysis of government websites informing the public about vaccines, and found debunking and answering common questions to be the most commonly used approaches [49].

News articles
Four studies experimented with different communication strategies countering misinformation within mock news articles that study participants read within online surveys. Stojanov (2015) examined the effect of countering general anti-vaccine misinformation with articles that either solely provided debunking information or provided debunking information alongside an explanation of the conspiracists' motivations and the fallacies in the anti-vaccine information. Another explored 'psychographic microtargeting' by developing fictional news articles that aimed to address MMR vaccine misinformation by targeting psychological traits thought to be linked to vaccine attitudes [29]. Dixon et al [27] tested how different journalistic approaches to covering the autism-vaccine link, including a false-balance framing and articles presenting the weight of evidence against the link, affect personal beliefs about the link. Lastly, Jolley & Douglas [28] examined the timing of exposure to anti-conspiracy articles rather than message content; this study assessed the effects of exposure to corrective information, in article form, before or after exposure to misinformation presented in articles supporting general vaccine conspiracy theories.

Visual communications
Several studies explored communication strategies that utilized visuals such as images or charts to address misinformation. Pluviano et al [38] compared three approaches, including a 'visual correction' utilizing infographics to compare the risks of measles, mumps and rubella with the risk of vaccine side effects, as well as a 'fear correction' showing graphic pictures of unvaccinated children infected with the diseases. The use of graphic images of infected children was also tested by Nyhan et al [50]. Two studies utilized visuals to convey medical consensus or the weight of evidence behind accurate vaccine information: van der Linden et al [51] used pie charts conveying medical consensus that vaccines are safe ('descriptive norm') or that parents should be required to vaccinate their children ('prescriptive norm'), while Dixon et al [27] used photos of either a single scientist or a group of scientists within a fictional article. The fotonovela project [41] also employed the use of images.

Message manipulations
Pluviano et al [52] manipulated message source, testing the effects of corrective messages about a fictional disease and vaccine coming from sources deemed to be of high or low expertise and trustworthiness, while Kerr et al [53] assessed message length, testing the effect of short and long messages, as well as manipulating message content and trialling messages that communicated varying levels of caution about COVID vaccine efficacy. Most of these studies focused on message content or framing. Kerr et al [53] compared various messaging strategies to communicate information and address concerns and misinformation about COVID-19 vaccines, including a 'factbox' format presenting benefits and harms, a question-and-answer format, a message explaining the vaccine approval process, and a 'scientific mechanism message' explaining how mRNA vaccines work. Batteux et al [30] conducted an experiment on messaging around the COVID-19 vaccine, testing hypothetical government announcements that communicated either with certainty or uncertainty about the effectiveness of vaccines that were under development.
Table 4 key: ▲ = negative effect (worsened knowledge, attitudes, or intention to vaccinate; increased belief in misinformation). Sample size: n > 500. Comparison groups: C = comparator is control group; P = pre/post comparator; I = intervention arms compared to one another.
Notes:
1. No clear effect on vaccine hesitancy (measure was a 3-item composite measure of concern about MMR vaccine side effects), but improved general attitudes (3-item composite measure of attitudes towards having one's child receive the MMR vaccine).
2. In subgroup analysis (low and high side-effect concern), the correction intervention significantly reduced false beliefs and safety concerns only for the low side-effect concern group. For intention to vaccinate, subgroup analysis found that the correction intervention decreased intention among the high side-effect concern group.
3. Vaccine hesitancy measure was a single item on the perceived likelihood of MMR vaccine side effects.
4. In subgroup analysis, the correction message only significantly decreased intention to vaccinate among respondents with the least favorable vaccine attitudes prior to intervention.
5. Belief in misinformation and intention to vaccinate were significantly worse compared to control at time 2 (7 days after intervention) only, and not at time 1 (immediately after).
6. Belief in misinformation and vaccine hesitancy were significantly worse compared to control at time 2 (7 days after intervention) only, and not at time 1 (immediately after).
7. Vaccination attitudes measure was a 3-item composite measure of agreement with statements about vaccines causing autism, the frequency of MMR vaccine side effects, and intention to vaccinate.
8. Attitudes/beliefs were measured with adapted versions of the perceived efficacy and public importance subscales of the Oxford COVID-19 vaccine beliefs scale; the estimated percentage of cases prevented served as an additional measure of perceived vaccine efficacy. Vaccine hesitancy was measured using the 7-item Oxford COVID-19 Vaccine Hesitancy Scale. No-caution messages (both short and long) improved the attitudes/beliefs measure, but had no effect on vaccine hesitancy.
9. Vaccination attitudes measure was a 3-item composite measure of agreement that vaccination protects against and reduces the risk of whooping cough.

Five studies were assessed to be at high risk of bias, and the remaining 20 were judged to have some concerns for risk of bias (Fig. 2). Four uncontrolled before-and-after studies [30,39,47,49] were not formally assessed, but their findings are discussed.

Debunking/refutational interventions and messaging
Interventions and messaging that focused on debunking or refuting misinformation had mixed results; while three debunking interventions across four studies were found to lessen belief in misinformation, two interventions using myths-vs-facts pamphlets worsened belief in misinformation and another two approaches had no clear effect on belief in misinformation. Only one small study examined knowledge as an outcome and found no clear effect [45]. Most studies of debunking interventions examined the effect on reported intention to vaccinate or on broader measures of vaccine attitudes, beliefs, and hesitancy. Five debunking interventions had mixed or no clear effect on intention to vaccinate, and two experiments found debunking messaging to decrease vaccination intentions among some respondents (those with more hesitant attitudes towards vaccination at baseline) [53,54]. Most debunking interventions had mixed or no clear effect on broader vaccine attitudes. One experiment with two different refutational messages found that both tested strategies improved general attitudes towards the MMR vaccine, but had no clear effect on vaccine hesitancy, which was measured as concern over vaccine side effects [62]. Another found that a corrective message reduced false beliefs and safety concerns, but subgroup analysis revealed that this effect held only among respondents with less concern about side effects at baseline [53]. One of the studies testing a myths-vs-facts booklet found that the intervention worsened beliefs about MMR vaccine side effects [43].

Informational interventions/communication strategies
Informational interventions seemed to generally improve vaccine-related knowledge, with all four studies that assessed knowledge as an outcome reporting an improvement, though three of the four had very small samples [30,39,47,49]. Most (n = 9) of the informational interventions had mixed results or no clear effects on vaccine attitudes, though a question-and-answer pamphlet [30] and an interactive educational program [39] were found to improve attitudes, and an informational video, when narrated by a male voice, was found to increase reported intention to vaccinate [50]. The effect on belief in misinformation was assessed for five of the informational interventions, with mixed or no clear effects found for four of the five [33,42,53,54]; however, a fictional news article providing information about needle-free methods of vaccine administration was found to reduce endorsement of misinformation among individuals high in 'needle sensitivity', i.e. anxiety and discomfort related to needles and blood [33].

Disease images & other 'scare tactics'
Communications utilizing disease images and other dramatic approaches were largely ineffective. Interventions using photos of children experiencing severe symptoms of vaccine-preventable diseases worsened or had no clear effect on belief in misinformation and vaccine attitudes. The similar tactic of using a dramatic narrative about an infant almost dying of measles also backfired, increasing the perceived risk of severe vaccine side effects [54]. However, one study found that a news article including a parent's graphic description of their child's experience with measles decreased endorsement of vaccine-related misinformation among individuals high in 'moral purity', a psychological trait in which decision-making is driven by feelings of disgust and the desire to avoid contamination and disease [33].

Humor
Three studies examined the use of humor in messages correcting or criticizing vaccine misinformation. One assessed vaccine hesitancy and found that both humorous and non-humorous messages reduced vaccine hesitancy, but were more effective for different populations: the humorous, satirical message reduced vaccine hesitancy among participants who held more false vaccine beliefs at baseline, while the non-humorous message was more effective among respondents who already held positive vaccine beliefs [57]. Two HPV vaccine studies [36,38] found that both humor-based and logic-based corrections reduced misperceptions, but that the logic-based corrections were perceived as more credible and were more effective at reducing misperceptions.

Message intensity
'Stronger' messaging, which negates a risk more strongly or communicates with more certainty, seems to backfire, with two studies finding such messaging to worsen vaccine hesitancy and intention to vaccinate, while communicating with uncertainty or weaker risk negations improved vaccine hesitancy and intentions relative to the stronger messages [34,58]. A third study found mixed results as to the impact on vaccine hesitancy, intentions, and beliefs of communicating the effectiveness of COVID-19 vaccines with varying levels of certainty; for example, messages that did not caution that protective measures would still be needed post-vaccination were associated with higher perceived vaccine efficacy, but this had no effect on vaccination intentions [56].

Misinformation warning
Two studies assessed strategies alerting participants to the potential for misinformation on Twitter [59] and during a Google search [52]. This appears to be a promising approach: Tully et al [59] found belief in misinformation to decrease with a misinformation warning, while Ludolph et al [52] found that combining simple, comprehensible information about vaccines with a misinformation warning improved participant knowledge and attitudes; however, the effects of the misinformation warning alone are less clear.

Communicating weight-of-evidence/scientific consensus
Communicating the weight of evidence surrounding vaccines, that is, explaining which standpoint is supported by evidence and scientific consensus, may be a promising strategy. Two studies found that belief in misinformation was reduced when communicating the weight of evidence alongside a visual exemplar (a photo of scientists for Dixon et al [31]; pie charts highlighting medical consensus for van der Linden et al [60]). Communicating the weight of evidence without a visual exemplar had no effect on belief in misinformation [31], indicating that the visual exemplar is a crucial piece of the intervention. Vaccine hesitancy and attitudes towards vaccines were also improved in van der Linden's experiment with messages highlighting medical consensus [60].

Timing of message
Only a single study specifically tested the timing of message interventions to see whether they are more effective prior to or following exposure to misinformation. Jolley & Douglas [32] found that anti-conspiracy arguments improved vaccine attitudes and increased intention to vaccinate when presented before participants read a conspiracy article, but not when presented afterwards as a debunking measure. Interestingly, in Bode & Vraga's study, while a debunking approach on Facebook of linking to articles that refuted misinformation had no effect on respondents' vaccine attitudes, the same approach was successful in reducing misperceptions about a link between GMOs and illness.

Message source
Message source seems to be a salient factor in vaccination communications. Pluviano et al [61] found that misinformation corrections from sources varying in expertise and trustworthiness did not affect vaccination intent, though corrections from high-trustworthiness sources (whether high or low in expertise) decreased participants' reliance on the misinformation; the authors concluded that the trustworthiness of a message source was more important than its expertise. Sullivan [37] tested refutational messages on Facebook from four different sources; one, the American Library Association, actually resulted in lower vaccination intentions, while messages from the CDC or another Facebook user reduced vaccine misperceptions. Two studies considered how message source interacts with other factors. Witus & Larson [50] found that while a male-narrated informational video led to improved vaccination intentions across the sample, the same video, when narrated by a female voice, was associated with decreased vaccination intentions among politically conservative participants. Betsch & Sachse [58] looked at message source and intensity together and found that, for a non-credible source, a message with weaker risk negations was more effective at improving vaccination intentions and hesitancy, while for credible sources the intensity of the risk negation made no difference.

Discussion
Vaccine hesitancy is a key public health concern, contributing to disease outbreaks and interfering with epidemic control. Exposure to misinformation has been shown to decrease intent to accept a vaccine [13], highlighting the importance of identifying strategies that can effectively counter misinformation. This review demonstrates the wide range of communication strategies that have been tested and implemented to date, using a variety of formats (from interactive, in-person educational sessions to social media-based corrections of misinformation) and messaging approaches, including debunking, informational messages, using humor, and manipulating the source or intensity of messages. Some strategies seem to be clearly ineffective: employing scare tactics, such as graphic images of children infected with vaccine-preventable diseases, increased belief in misinformation [42,54], and communicating about vaccines with certainty, rather than acknowledging uncertainty around vaccine efficacy or risks, was also found to backfire [34,58]. Overall, promising communication strategies include communicating the weight of evidence and scientific consensus around vaccines and related myths, utilizing humor, and incorporating warnings about encountering misinformation. Trying to debunk misinformation, informational approaches, and communicating uncertainty may help with some outcomes, but had mixed results and should be investigated further.
The most common approaches taken by the reviewed studies, debunking and informational messaging, had very mixed effects across the considered outcomes. This is consistent with research showing that providing a dissenting message can backfire and reinforce the pre-existing beliefs of a particular group [63]. One study, however, showed that 'pre-bunking', anticipating the 'myth' and providing the anti-conspiracy message in advance, could increase intention to vaccinate [32], in line with effective prophylactic approaches identified elsewhere [64].
Heterogeneity was also found within studies, with some trials finding that a certain intervention was effective at improving attitudes or vaccination intent among some sub-populations but not others. This underscores the complexity of vaccination behaviors and the myriad factors that can influence attitudes, hesitancy, and decision-making [65]. More actionably, this heterogeneity highlights the need for communication strategies and interventions to be chosen and tailored in audience-specific ways. Political views, shown to be associated with COVID-19 vaccination rates [66], are important to consider for both the targeting and tailoring of communications interventions; for example, Witus & Larson [50] found that among politically conservative individuals, an educational video about COVID-19 vaccines could increase vaccination intentions if narrated by a male, but that there was a backfire effect with a female narrator; this difference was not seen among politically liberal participants. Strategies may also work better or worse among individuals with different baseline attitudes towards vaccination or belief in misinformation; for example, Moyer-Guse et al [57] found that humor-based misinformation corrections were more effective at reducing vaccine hesitancy among participants holding more false vaccine beliefs, while non-humorous messages were more effective for those who already had positive beliefs towards vaccines. These findings illustrate that the design of interventions and communications strategies needs to consider who, specifically, they are targeting and are likely to reach, and underscore the need for future trials to conduct rigorous sub-group analyses and to understand the baseline beliefs and attitudes of respondents.
The body of evidence generated by this systematic review points to several important considerations for its interpretation and for future research. Firstly, many of the studies were randomized trials conducted in simulated situations unlikely to reflect how a communication strategy would be implemented in real life; a messaging approach that works when an individual reads an article embedded in an online survey experiment may not have the same effect when translated into a mass media campaign. Only a handful of studies examined actual interventions as they would be realistically delivered, and most of these did not rigorously assess effectiveness. Additionally, none of the studies directly measured the effect of interventions on actual vaccine uptake, though many surveyed participants about vaccination intention. Future research in this arena should aim to conduct real-world piloting of interventions and measure resulting changes in uptake of vaccines, in addition to indicators such as vaccination intention, knowledge, and belief in misinformation. Lastly, research must keep up with the ever-changing media landscape through which misinformation propagates; future studies should consider how to address misinformation spreading on other social media platforms, such as Instagram, WhatsApp, and TikTok [67], including how modifying search algorithms, ranking different types of evidence and sources of information, and reporting online misinformation may help address disinformation [68].
Our findings contribute to and build upon the existing body of literature on interventions for improving vaccination rates. A previous review [69] found that text messaging interventions, primarily for vaccine appointment scheduling and/or reminders, and immunization campaign websites were among the potentially effective 'new media' approaches to increasing immunization coverage; avenues such as incorporating humorous corrections of myths or sharing misinformation warnings as part of text messaging interventions should be explored. A meta-analysis of social media-based health misinformation corrections found that debunking is an effective strategy, but that correcting misinformation related to infectious diseases is more difficult, which may explain the more heterogeneous effects found in this review [19]. A 2015 review of reviews [18] echoes our finding that there is a lack of strong evidence to specifically recommend any intervention; the authors noted that educational tools, such as pamphlets, had little or no effect on vaccine hesitancy, and highlighted the need to understand groups' specific concerns, rather than seeking an intervention that will be universally effective.
This review has some limitations. Firstly, most studies were conducted in the United States, and thus findings should be applied to other settings with caution. Secondly, the literature search was conducted approximately one year into the COVID-19 pandemic; future research should aim to assess more recent and longer-term evaluations of communication interventions, and to search additional databases. Our searches were restricted to studies published in the English language and we did not search for grey literature. Notably, previous studies indicate that the exclusion of non-English articles has little effect on systematic review results [70,71]. Data extraction was performed by a single study member, rather than cross-checked by a second team member as originally planned.
We were also constrained by the body of literature included in this review. None of the included studies directly assessed vaccination uptake rates; intention to vaccinate was the most proximate outcome considered, but it still may not accurately reflect actual vaccination behaviors, and vaccination attitudes, knowledge, and hesitancy are even less closely tied to vaccine uptake. Lastly, the heterogeneity of the included studies (in terms of intervention type, targeted vaccine, study context, sample characteristics, and assessed outcomes) precluded any formal meta-analyses, and the narrative synthesis approach taken required subjective distillation and summarization of the included studies. Despite these limitations, this review's strengths include identifying communication strategies that should be avoided, scoping out the body of research in an underexplored area, and identifying key areas for future research.

Conclusion
The heightened vaccine hesitancy that has arisen with COVID-19 vaccines has brought to the fore the need to address ubiquitous vaccine mis- and disinformation. This review provides evidence on some promising avenues for communication strategies addressing misinformation, such as conveying the weight of evidence and scientific consensus around vaccines and related myths, utilizing humor, tailoring communications to specific target audiences, and incorporating warnings about encountering misinformation. Our findings question the effectiveness of commonplace interventions such as debunking vaccine myths, employing scare tactics, and communicating with certainty. There is an urgent need to develop and evaluate interventions to reduce hesitancy caused by mis- and disinformation, and to directly measure their effect on the outcome of vaccine uptake, rather than assessing only distal outcomes such as knowledge or attitudes, in quasi-experimental and real-life contexts. Only then will it be possible to ensure that effective public health interventions such as vaccination benefit all.

Funding
This study is funded by the National Institute for Health Research (NIHR) Health Protection Research Unit in Vaccines and Immunisation (NIHR200929), a partnership between UK Health Security Agency and the London School of Hygiene & Tropical Medicine, and by the NIHR Health Protection Research Unit in Behavioural Science and Evaluation at University of Bristol, in partnership with UK Health Security Agency (UKHSA). The views expressed are those of the author(s) and not necessarily those of the NIHR, UK Health Security Agency or the Department of Health and Social Care.

Data availability
No data was used for the research described in the article.

Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.